Computing innovation refers to the creation and application of novel ideas, technologies, and approaches within the field of computing. It involves developing new algorithms, software systems, hardware architectures, and computational methodologies to solve complex problems, improve efficiency, and drive advances across many domains. Computing innovation takes many forms, including software development, new hardware designs, algorithms, technology-based platforms, cloud computing, the Internet, automation, big data analytics, artificial intelligence, and blockchain. It plays a vital role in driving technological advancement, problem-solving, digital transformation, economic growth, and competitiveness. Examples of computing innovations include social media, citizen science, and smart sensors, as well as technologies directly attributable to computer science such as the Internet, broadband, PC and laptop computers, mobile phones, DNA testing and sequencing, and microprocessors.