Voice Recognition Security Isn’t Hack Proof!

Our voice is unique, much like our DNA, fingerprints, and irises, but that does not mean it can never be spoofed. Hackers have started attacking authentication methods that have a reputation for being extremely safe, and voice recognition is one of them! They know passwords are now better protected, so their new target is multi-factor authentication.

If the warnings from the Federal Communications Commission (FCC) and the Better Business Bureau (BBB) are anything to go by, your voice is not fully hack-proof. They issued these warnings after scam callers were identified in January 2017. Those callers used various tricks to get victims to say the word “yes,” recorded the response, and used it to authorize fraudulent credit card or utility charges.

Image source: fortune.com

Thus, if you think replacing passwords with biometrics is safe, then you are certainly wrong. There have been instances where fingerprints were cloned and hacking became extremely easy. Now, the same is happening with voice as well! Hackers are on a spree to clone your voice, for obvious reasons. One of the most widely used applications of speech recognition is in smartphones.

Here, you use your voice to unlock your phone, which is no less than a gold mine for hackers. You save everything on your smartphone, be it your card details, passwords for various accounts, or your personal pictures and videos. Hackers just need a few audio clips of your voice so that they can reshape them into something useful to them!

Do Not Take It As A Hoax

There is substantial evidence to prove that voice recognition security is on the verge of being compromised. White hat hackers have demonstrated this through proof-of-concept attacks in a few cases. Recently, a group of researchers from the University of Montreal’s Institute for Learning Algorithms announced that they will soon be able to mimic any voice completely using nothing but recorded audio. Also, according to Scientific American, there is a technology called Lyrebird that relies on AI-powered neural networks and deep learning techniques and can turn small bits of recorded sound into full speech. It won’t be long before these techniques are used against users across the globe!

Image source: information-age.com

Must Read: Common Cryptocurrency Scams & How to Stay Safe

Threats That Come Along With Compromised Voice Recognition Security

If we take a closer look at the threats, we’d find that there are plenty of them! Let’s take a few examples to prove the point.

Voice ID Theft

It turns out that, among the many threats and vulnerabilities associated with voice recognition, cloning someone’s voice is almost as easy as snapping your fingers! Several tools were designed for creative and legitimate tasks, but they are equally available to people with malicious intent. Voice cloning can severely compromise your financial accounts. You ask how? Well, many banks now offer voice verification, and plenty of people are opting for it!

Smart Speaker Hacks

Recently, white noise attacks on smart assistants made it clear that any voice-controlled device can be compromised with commands hidden inside seemingly harmless audio. Hackers can make your smart assistant do things merely to disturb you, such as playing loud music at odd hours! Imagine the consequences when these smart assistants also control your home security system! We definitely don’t want any of that to happen. A simplified sketch of the masking idea follows below.
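To make the idea concrete, here is a minimal, purely illustrative Python sketch. It assumes numpy and scipy are installed and that a short recording named command.wav exists; none of these are named in the original research, and real hidden-command attacks go much further, crafting adversarial audio tuned to the assistant’s speech-recognition model. Treat this only as a toy demonstration of how a quiet command can hide under louder static.

    # Toy illustration only: bury a quiet voice command under white noise.
    # Real hidden-command attacks use adversarial audio crafted against the
    # assistant's speech model; this sketch only shows the masking concept.
    # Assumes a 16-bit mono file "command.wav" in the working directory.
    import numpy as np
    from scipy.io import wavfile

    rate, command = wavfile.read("command.wav")
    command = command.astype(np.float32) / np.iinfo(np.int16).max   # normalise to [-1, 1]

    noise = np.random.normal(0.0, 0.3, size=command.shape)          # masking static
    mixed = np.clip(0.1 * command + noise, -1.0, 1.0)               # command sits below the noise floor

    wavfile.write("masked.wav", rate, (mixed * np.iinfo(np.int16).max).astype(np.int16))

To a casual listener the output sounds like plain static, which is exactly why voice-controlled devices need to verify who is speaking, not just what is being said.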

Deepfake Voice

Deepfakes are not just a source of fake news; they are also the culprit behind fake celebrity videos. In those clips, even the audio seemed legit! You ask how? Through AI-powered deepfake voice. The same technique can be used against ordinary people as well. If creating fake voices becomes this easy, then voice as a mode of verification will fall off a cliff!

Image source: theinstitute.ieee.org

Also Read: Vulnerabilities That Intimidate Your Device Security

The Final Verdict

Every technological advancement creates corresponding opportunities for people with malicious intent, and voice commands and speech-controlled gadgets will be no exception. Thus, it is crucial that you take effective measures to secure yourself! Voice cloning is not easy to prevent or to stop from spreading, and there is still nothing reliably guarding us against this type of attack, so we’d recommend you follow these precautions:

  • Don’t sign up for voiceprint login into your financial accounts.
  • Don’t connect your smart speakers to your home security system.
  • Be wary of what commands you give to your personal assistants like Cortana.
  • Run internal audits (for organizations) and scan your systems (for personal use) to make sure you are shielded.

There is no surefire way to combat this, but we can always use our wisdom and take basic precautions. This will ensure that we are not the next victims of compromised voice recognition security! What are your views on this?
