
1st International Scientific Conference on Economy, Management and Information Technologies – ICEMIT 2023

Recent trends in computing & information technology

Denis Jelaš, Damir Debak, Pavle Sviličić
Aspira University of Applied Sciences, Croatia
Corresponding author e-mail address: denis.jelas@aspira.hr

Review paper
DOI: https://doi.org/10.46793/ICEMIT23.307J
UDC/UDK: 005.332.2:004

Abstract: This document presents recent trends in computing and information technology and their expected development in the coming period. The following trends are covered: artificial intelligence and machine learning, quantum computing, blockchain, cybersecurity, edge computing, robotic process automation (RPA), virtual and augmented reality, and the internet of things. The general idea is to describe each trend and highlight the predictions for the coming period. The aim of the work is to highlight the expected trends in the development of computing and information technology, their mutual relationships and their impact on different spheres of development and people's lives. We live in a rapidly changing time, and it is not easy to predict what awaits us or which directions development may take, but it is certain that it will be very exciting.

Keywords: IT trends, computer trends, computing trends, technology trends.

1. Introduction

This document presents selected trends in computing and information technology and their expected development in the coming period; some other trends are not covered. The trends covered are those that are most representative of computing and information technology and for which the fastest development is expected in the coming period: artificial intelligence and machine learning, quantum computing, blockchain, cybersecurity, edge computing, robotic process automation (RPA), virtual and augmented reality, and the internet of things. Each chapter describes the meaning of the trend, the technology it uses and its relationships with other technologies and trends. At the end, expectations for future development as well as potential difficulties or obstacles are presented. A trend is a general direction in which something is developing or changing, and the purpose of this document is to analyze trends in computing and information technology in order to predict the direction of future development and the expectations for each of the trends. Not every trend can be isolated and observed separately from the others; therefore, the final chapter will present the joint effect of all observed trends in computing and information technology, whereby an additional synergistic effect is often achieved.

2. Artificial Intelligence (AI) and Machine Learning (ML)

Artificial intelligence (AI) is the perceptive, synthesizing, and reasoning intelligence exhibited by computers or machines, as opposed to the intelligence exhibited by animals and humans. Examples of tasks where this is applied include speech recognition, computer vision, translation between (natural) languages, driverless car control, robotic vacuum cleaners and chatbots. Artificial intelligence is based on the principle that human intelligence can be defined in such a way that a machine can imitate it and perform tasks, from the simplest to the most complex. The goals of artificial intelligence include mimicking human cognitive activity.

Two important issues will determine the further development of artificial intelligence: ethical dilemmas and legislation. An example of an ethical dilemma can be found with autonomous vehicles: in the case of a fatal incident, who is to blame for the accident (Dilmegani, 2023)? Most countries do not yet have legislation that deals with such cases. There have been discussions about the ethics of using artificial intelligence in military weaponry, especially in the use of drones. From an ethical and legislative perspective, the important question is whether governments or private companies are misusing AI technology or using it legally. Surveillance methods that threaten human rights and privacy must not be allowed.

Image 1. Introducing ChatGPT. Source: Unsplash (n.d.). https://unsplash.com/s/photos/introducing-chatgpt

AI is not sufficiently transparent and neutral, and its decisions are not always understandable to humans. Decisions based on artificial intelligence are susceptible to inaccuracies, discriminatory outcomes, and built-in or embedded biases. Artificial intelligence describes a machine that can mimic human intelligence, while machine learning aims to teach a machine how to perform a specific task and produce accurate results by identifying patterns. Machine learning is a branch of the broader field of artificial intelligence that uses statistical models to develop predictions. It is traditionally defined as the ability of a computer to learn without being explicitly programmed to do so. Examples of machine learning domains include weather forecasting, medical diagnosis, aerospace, facial recognition, stock markets, social media, signature verification, forensics, robotics, electronics hardware, defense, and seismic data gathering (Goell et al., 2022). Conversational AI systems will become more advanced; for example, chatbots have yet to become efficient enough to answer queries beyond simple ones that follow a set pattern. The most recent example is ChatGPT. "ChatGPT is an AI chatbot that uses natural language processing to create humanlike conversational dialogue. The language model can respond to questions and compose various written content, including articles, social media posts, essays, code and emails." (Hetler, 2023).

ChatGPT uses deep learning, a subset of machine learning, to generate human-like text through transformer neural networks. It predicts the text, including the next word, sentence or paragraph, based on the typical sequences of data it learned earlier during its training process. There is already an impact of ChatGPT on the job market, and jobs that require programming and writing skills are at risk. Jobs such as web and digital interface designers, software developers, journalists, tax preparers, writers, mathematicians and blockchain engineers are most at risk of being replaced by artificial intelligence. More and more enterprises are considering merging predictive analytics with artificial intelligence trends in 2023. This will help them achieve more accurate and timely forecasting of business decisions. AI and IoT may be separate concepts, but we expect a trend of combining both technologies that will change almost everything in the way we live, including the way we do business. Fewer security breaches are expected with the increasing use of artificial intelligence in cybersecurity. The need to process huge amounts of complex data will result in the merging of quantum computing and AI. RPA (robotic process automation) technology increasingly uses advanced artificial intelligence capabilities in the form of machine learning, natural language processing and image recognition, which will make it more powerful in handling the cognitive processes of various applications.
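To make the idea of next-word prediction concrete, the following minimal sketch trains a toy bigram model on a tiny made-up corpus and suggests the most likely next word. The corpus, function names and output are purely illustrative assumptions; real systems such as ChatGPT use transformer networks trained on vastly larger data, but the underlying principle of predicting the next token from previously seen sequences is the same.

```python
# Toy next-word predictor: counts word bigrams in a tiny corpus and
# suggests the most frequent follower. Illustrative sketch only; real
# language models use transformer networks trained on far larger data.
from collections import Counter, defaultdict

corpus = (
    "artificial intelligence will change the way we work "
    "artificial intelligence will change the way we live"
)

words = corpus.split()
bigrams = defaultdict(Counter)
for current_word, next_word in zip(words, words[1:]):
    bigrams[current_word][next_word] += 1

def predict_next(word):
    """Return the most frequent word observed after `word`, if any."""
    followers = bigrams.get(word)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("will"))   # -> "change"
print(predict_next("we"))     # -> "work" (first seen among equally frequent options)
```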

3. Quantum Computing

"Quantum computing is a rapidly emerging technology that harnesses the laws of quantum mechanics to solve problems too complex for classical computers" (IBM, 2023). Quantum computers are very different from the classical computers that have existed for more than half a century, and they are used in situations of high complexity that classical computers cannot handle. With specific quantum algorithms, these computers take a new approach to solving complex problems by creating multidimensional computational spaces. Their processors must be very cold, close to absolute zero, to maintain their quantum states. To achieve this, supercooled superfluids are used. At ultra-low temperatures, some materials show an important quantum-mechanical effect: electrons move through them without resistance, and we call such materials superconductors. In practice, it is not easy to maintain temperatures close to absolute zero, and therefore quantum computers require special hardware design. In classical computers, bits can be in one of two states (usually denoted 0 and 1), while in quantum computers qubits can be in both states at the same time. This mode of operation enables much higher speeds than today's computers for certain classes of problems. Quantum computers are a new step forward and even greater progress is expected in the next few years, as large amounts of money are being invested in the discovery of new superconducting materials. Discoveries in quantum physics provide a wide range of new theoretical hardware capabilities that fuel optimism about quantum computing, but a growing understanding of its limitations, such as quantum errors and the loss of qubit coherence caused by the slightest environmental changes, balances this optimism.

Image 2. AI Generated Quantum Computer. Source: Pixabay (n.d.). https://pixabay.com/images/search/ai%20generated%20quantum%20computer/
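A minimal sketch of the bit-versus-qubit idea described above, assuming only NumPy: a classical bit is either 0 or 1, while a qubit state can be represented as two complex amplitudes whose squared magnitudes give the measurement probabilities; applying a Hadamard gate to the |0⟩ state produces an equal superposition of both outcomes.

```python
# Sketch of a single qubit as a 2-component state vector (not a real
# quantum simulator). Squared amplitude magnitudes give the probability
# of measuring 0 or 1.
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)           # classical-like state |0>
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposition = hadamard @ ket0                       # equal superposition of |0> and |1>
probabilities = np.abs(superposition) ** 2

print(probabilities)   # -> [0.5 0.5]: both outcomes are equally likely until measured
```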

4. Blockchain

Blockchain is a decentralized, distributed and public digital ledger that allows transactions to be recorded on a large number of computers so that a record cannot be retroactively changed without changing all subsequent blocks and obtaining network consensus (Hayes, 2023). This technology enables the existence of cryptocurrencies by providing the fundamental principles by which cryptocurrencies function in the first place. Without the concept of blockchain there would be no bitcoin, nor any of the newer cryptocurrencies. With the help of this technology, the problem of creating a distributed database is solved without the need for a special entity to monitor transactions. In classic banking transactions between two users, the bank plays the role of supervisor and recorder of transactions and ensures that one user will not deliberately defraud the other. Blockchain provides an alternative to such a classical system by eliminating the bank as a centralized institution to be trusted and replacing it with a decentralized network of computers that confirm transactions based on a specific algorithm. Anyone who wants to earn, be rewarded, or "mine" bitcoins or some other mineable cryptocurrency can join this decentralized network.

Image 3. Bitcoin. Source: Unsplash (n.d.). https://unsplash.com/s/photos/bitcoin

The blockchain network for the bitcoin cryptocurrency is made up of users and "miners". Users need miners to confirm or record transactions, and miners rely on users because users create the transactions from which miners can earn cryptocurrencies.
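The tamper-evidence property described above (a record cannot be changed retroactively without changing every later block) can be illustrated with a minimal, hypothetical hash chain. Real blockchains add consensus, signatures and network distribution on top of this idea; the block structure and example data below are assumptions for illustration only.

```python
# Minimal hash chain: each block stores the hash of the previous block,
# so editing an old block changes its hash and breaks every later link.
# Illustrative sketch only, not a real blockchain implementation.
import hashlib
import json

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, data):
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": previous})

def is_valid(chain):
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

ledger = []
add_block(ledger, "Alice pays Bob 1 BTC")
add_block(ledger, "Bob pays Carol 0.5 BTC")
print(is_valid(ledger))                          # -> True

ledger[0]["data"] = "Alice pays Bob 100 BTC"     # retroactive tampering
print(is_valid(ledger))                          # -> False: later links no longer match
```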

The transaction confirmation system performs demanding mathematical calculations, which are done by miners using a scheme such as Proof of Work (PoW) or Proof of Stake (PoS). PoW is the original principle that initially solved the problem of the credibility of a distributed record ledger. The drawback of the PoW system is that it consumes a great deal of energy and resources due to the competition over the speed of solving a cryptographic task. PoS, on the other hand, works on the principle of staking existing cryptocurrencies, which gives the user a chance to participate in the verification of transactions. Customers who have staked cryptocurrencies participate in the maintenance of the network and earn by collecting a transaction fee paid by the users who initiate a transfer to another user. One of blockchain's biggest trends in the near future is the growth of business operations that use blockchain. The decentralized blockchain structure offers better safety, transparency, and protection against cyber-attacks, which is why more and more companies will use this technology to their advantage. As we enter the third decade of blockchain, it is no longer a question of whether this technology will be adopted, but of how soon. Today we see tokenization of assets, bitcoin ATMs, acceptance of bitcoin as an official currency in some countries, and more and more cryptocurrency transactions. As a result, the next decades will prove to be a significant growth period for blockchain.
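As a rough illustration of the proof-of-work principle described above, the hypothetical sketch below searches for a nonce that makes a block's hash start with a given number of zero hex digits. The data string and difficulty value are assumptions chosen for speed; real networks use far higher difficulty targets, which is why mining consumes so much energy, while verifying a found solution still takes only a single hash.

```python
# Toy proof-of-work: find a nonce so that SHA-256(data + nonce) starts
# with `difficulty` zero hex digits. Illustrative sketch only.
import hashlib

def mine(data, difficulty=4):
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("Alice pays Bob 1 BTC")
print(nonce, digest)   # finding the nonce took many hashes; checking it takes one
```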

5. Cybersecurity

Cybersecurity is the application of various technologies to protect systems, computer networks, programs, devices and data from threats and attacks that arrive via the Internet, i.e. from so-called cyberspace. We try to protect ourselves from attacks by criminals who attempt to compromise our systems, networks and data in order to take control of them or steal data. "The multi-billion-dollar information and communication technologies (ICT) security market is one of the fastest growing in the world. The ICT security field is a highly complex cross-disciplinary domain that includes computer science, management science, software engineering, information systems, network management, policy making, and management of infrastructures" (Ahgar et al., 2014). We store data on computers and other devices connected to the Internet, and a large amount of this data is sensitive, such as passwords or financial data. If cybercriminals gain access to this data, they can use it to share sensitive information, use passwords to steal financial resources, or change data to their advantage. Cybersecurity is very important to companies in order to protect their data, systems and intellectual property. Some of the threats are DDoS attacks, malware, adware, botnets, ransomware, spyware, trojan horses, phishing, social engineering and SQL injection.

Image 4. Protect your computer from cyber attacks. Source: Unsplash (n.d.). https://unsplash.com/s/photos/Protect-your-computer-from-cyber-attacks

"Networks have become highly vulnerable to cyberattacks with a rapid increase in technology reliance, global connectivity and cloud usage. Additionally, the COVID-19 pandemic has caused a paradigm shift to online infrastructure and remote working, resulting in more cybercrimes." (Das, 2023). There are several cybersecurity trends that organizations and individuals need to watch out for, such as ransomware attacks, threats to the healthcare sector, artificial intelligence (AI)-assisted cyberattacks, exploitation of IoT vulnerabilities, a focus on users as an attack surface, attacks against cloud services, multi-factor authentication, and quantum cryptography threats (Khawaja, 2023). Cybercriminals are already using AI and machine learning tools to attack and probe the networks of small businesses and organizations that cannot afford significant investments in defensive cybersecurity technology. Extortion by hackers using ransomware and demanding payment in cryptocurrencies is becoming a growing threat.
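Of the threats listed above, SQL injection is easy to show in a few lines. The hypothetical sketch below, using Python's built-in sqlite3 module, contrasts an unsafe string-built query with a parameterized one; the table, column and data are invented purely for illustration.

```python
# SQL injection sketch using the standard-library sqlite3 module.
# The table and data are made up purely for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 1), ('bob', 0)")

user_input = "x' OR '1'='1"          # attacker-controlled value

# Vulnerable: the input is concatenated into the SQL text, so the
# injected OR clause makes the query return every row.
unsafe = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()
print(unsafe)        # -> [('alice',), ('bob',)]

# Safer: a parameterized query treats the input as a literal value.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()
print(safe)          # -> []
```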
