Malicious software, also known as malware, has become as ubiquitous as the computing devices we use every day. Malware tends to be as sophisticated as its intended targets, and security researchers are now concerned about a new breed of malware that could attack smart software such as Apple's Siri, Google Now, and Microsoft's Cortana.
When a computer virus infects a system, it may lie dormant until its code instructs it to execute and propagate. This is the most common mechanism that computer users in Apple Valley, Hesperia, and Victorville are protected against when they install antivirus software on their desktops and laptops.
How is Malware Detected and Prevented?
The infection and execution mechanism described above has been used by hackers for decades, long enough for computer security experts to develop protective measures against it. The key is detecting a known footprint, or signature, so that the antivirus software can remove the malware and repair the system.
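As a rough illustration of signature matching (a toy sketch, not how any commercial antivirus engine is actually built), a scanner can hash a file's contents and compare the digest against a database of known-bad signatures. The function names and the signature database here are hypothetical:

```python
import hashlib

def file_signature(path):
    """Return the SHA-256 digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def is_known_malware(path, signature_db):
    """Flag the file if its digest appears in a set of known-bad signatures."""
    return file_signature(path) in signature_db
```

Real scanners go well beyond exact hashes (partial signatures, heuristics, behavioral monitoring), since changing a single byte of a file changes its digest entirely.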
Machine learning and artificial intelligence are used to a certain extent when developing antivirus software and Internet security suites. Based on previously detected malware, a security program can apply algorithms that learn to recognize new variations of known threats.
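The learning step can be sketched with a toy nearest-centroid classifier: given feature vectors extracted from samples already labeled as malicious or benign (the features and labels below are invented for illustration), a new sample is assigned the label of the closest class average. This is only a minimal sketch of the idea, not a production detection model:

```python
import math

def centroid(vectors):
    """Average a list of equal-length feature vectors component-wise."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(labeled_samples):
    """labeled_samples: dict mapping label -> list of feature vectors.
    Returns a model mapping each label to its centroid."""
    return {label: centroid(vecs) for label, vecs in labeled_samples.items()}

def classify(model, vector):
    """Assign the label whose centroid is nearest (Euclidean distance)."""
    return min(model, key=lambda label: math.dist(model[label], vector))
```

Even a simple model like this captures the general point: once enough labeled samples exist, a slightly mutated variant often lands near its family in feature space and can be flagged without an exact signature match.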
New Vulnerabilities for Malware to Attack?
The latest attack vector is related to the growth of artificial intelligence and its fast rate of adoption. Smart personal assistants such as Siri, Cortana and Google Now are now installed on smartphones and tablets around the world, and they are constantly listening for input and verbal commands.
Technology researchers believe that smart personal assistants could be duped by a mix of white noise and human voice snippets, directing the smart app to perform actions such as emailing contact lists to an unknown address, opening Bluetooth ports for listening, pulling up the URL of a rogue website to deliver malicious code, or tricking the device into joining a botnet.
Security analysts at Google are currently looking into ways of securing future versions of Google Assistant, a smart personal assistant that often handles personal information for smartphone and tablet users. Until then, it may be advisable not to allow smart assistants to listen continuously.