Top 10 Enterprises Using Artificial Intelligence

Published on
13/12/2019 07:38 AM

What is artificial intelligence? Generally speaking, people tend to define AI as computer systems that can mimic human intelligence.

If a computer system can demonstrate AI which is indistinguishable from human intelligence, it is said to have passed the “Turing test”, a benchmark named after the famous computer scientist Alan Turing.

Many companies are developing AI systems to give computers the ability to perform tasks such as visual perception, speech recognition and language translation.

If a computer could perform these tasks as proficiently as a human, it would open up entirely new markets for many businesses.

Here, we list 10 enterprises which are thought to be leading the way in developing AI systems.


Google

Professor Nick Bostrom, the Oxford academic thought to be one of the foremost experts in AI, believes Google is the world’s leading AI company. Many would agree.

Google’s uncanny ability to find what you’re looking for with its search engine is, arguably, good evidence of its AI capabilities.

But the company has gone way beyond that now, with numerous acquisitions to bolster its AI offering.

In 2014, Google bought AI startup DeepMind for a reported $625 million. DeepMind has developed an algorithm, AlphaGo, which can play Go better than the best human players in the world.

Go is a board game that is said to be more complex than chess.

DeepMind’s algorithms can also perform a variety of visual perception tasks, and even conjure up images of generic objects and things it has seen – which is almost like imagination.

For example, say DeepMind has previously seen millions of cat images. If you then asked it to draw a cat, it wouldn’t need to look at or copy any existing image; it could simply generate a new one.

Similarly, having played a lot of computer games, DeepMind can probably outplay the best gamers in the world.

In terms of enterprise applications, Google says it has used DeepMind to analyse its data centre operations. In doing so, DeepMind has found better ways of organising things which could save around a quarter of the energy costs.

As well as DeepMind, Google has bought many AI companies in the past few years. These companies specialise in things such as natural language processing, image recognition, and robotics.

Technically speaking, it was Alphabet – Google’s parent company – which made the acquisitions, but whatever name it goes by, this company seems likely to have the most complete AI offering this time next year.

Google’s CEO, Sundar Pichai, says the way we think of devices will change when AI is able to respond to us using voice and visual perception. It will be interesting to see what he means by that over the next few years.


Nvidia

One of the fundamental requirements for AI at the moment is massive processing power, and Nvidia is a past master at this, having long been one of the leading producers of graphics processing units (GPUs).

GPUs were previously mainly of interest to gamers for obvious reasons, but in the past couple of years, Nvidia has been doing a lot of business in the cryptocurrency sector.

Bitcoin mining demands a lot of computing resources, and Nvidia GPUs have become the gold pan of choice for this new breed of fortune hunters.

Another key market in which Nvidia has become dominant is the automotive sector.

Cars are becoming more computerised. They are increasingly being connected to the internet; the mechanical operations of the car itself are increasingly being monitored using sensors; and onboard computers are increasingly being given the ability to drive cars by themselves.

If you think the gaming world – with all its massively multiplayer games and whizz-bang activity – is frenetic, try to imagine how much data an average car generates.

Games, moreover, are entirely contained within the computer. Cars operate in the real world, moving through three-dimensional space and responding to real-world physics, events, road signs, traffic lights and so on.

Real-world driving generates a colossal amount of data and Nvidia has launched an AI platform for carmakers and auto systems suppliers to be able to cope with it all.

Nvidia’s AI computer is likely to feature in vehicles from the majority of leading car companies.

Daimler Mercedes Benz

We could have picked any car company, but why not pick the oldest? Like all the leading automotive companies, Daimler is investing heavily in AI for the reasons outlined above.

Unsurprisingly, Daimler is likely to incorporate Nvidia technology in many of its new cars, but mostly in partnership with automotive component suppliers such as Bosch, which, in turn, has agreed a partnership with Nvidia.

Bosch says the onboard AI computer it has developed with Nvidia is more powerful than the human brain. It’s capable of 30 trillion floating-point operations per second – three times as many as a human brain, says Bosch.

Automakers are currently racing each other to develop cars which have ever higher levels of autonomy, as defined by the international engineers’ association, the SAE.

The SAE has created six categories of autonomy, starting at 0 for no autonomy and ending at 5 for full autonomy, or driverless.

By those standards, most cars on the road today are at 0 for no autonomy. They may have some electronics but no self-driving capabilities at all.

However, a very large number of relatively old vehicles have features such as cruise control and sophisticated braking systems, both of which could be said to be features of autonomous cars.

Newer cars tend to have self-parking capabilities and autonomous emergency braking. These features are collectively referred to as “advanced driver assistance systems”, and depend on quite significant computing resources and artificial intelligence.

Most automakers are aiming to produce cars that attain level 3 or 4 autonomy which means that the cars can drive themselves in many circumstances, but the human driver will probably remain responsible overall.
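The six SAE levels described above can be sketched as a simple lookup table. The descriptions below are a loose paraphrase for illustration, not the official SAE J3016 wording:

```python
# Rough paraphrase of the six SAE driving-automation levels (not official wording).
SAE_LEVELS = {
    0: "No automation: the human does all the driving",
    1: "Driver assistance: steering OR speed support, e.g. cruise control",
    2: "Partial automation: steering AND speed support, driver still monitors",
    3: "Conditional automation: car drives itself in some conditions, driver must take over on request",
    4: "High automation: car drives itself within defined conditions, no takeover needed there",
    5: "Full automation: car drives itself everywhere, no human driver required",
}

def describe(level: int) -> str:
    """Return a short description for an SAE level, 0-5."""
    if level not in SAE_LEVELS:
        raise ValueError(f"SAE levels run from 0 to 5, got {level}")
    return SAE_LEVELS[level]
```

The jump from level 2 to level 3 is the contentious one: it is the point at which responsibility for monitoring the road shifts, at least some of the time, from the driver to the car.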


IBM

IBM’s AI system has the memorable name of Watson. It’s the computing system which famously beat all human contestants on the popular US quiz show Jeopardy!

That game, however, was probably the least of its achievements since it was, essentially, a test of memory. And having devoured the entire contents of Wikipedia and numerous other sources, it was unlikely that Watson was going to lose.

But it’s still a good illustration of what AI does – it absorbs vast quantities of data, or big data, and uses it as raw material for returning usable and useful information.

Nowadays, Watson is mostly busy in the world of healthcare, analysing big data about patients to help diagnose various diseases. Real-world physics and biology are combining in the mind of Watson and it could end up performing many more of the functions that we traditionally associate with doctors.

Meanwhile, IBM is also developing a new generation of chips which are designed like the human brain, complete with neurons and synapses. These chips are not yet widely available. For now, the company is teaming up with Nvidia and others for its processing requirements.


Amazon

Amazon alone is said to have scooped up more than 40 per cent of all online shopping orders in the US over this past Christmas period. And who’s to say next Christmas will be any different?

One of the main reasons for Amazon’s success is, arguably, the company’s dynamic website, which can display items closely related to the ones you are clicking on – whether based on what other customers have bought, or on items similar to those you have previously bought or put in your basket.

It seems like a simple thing, but Amazon’s success would seem to prove the effectiveness of this approach. It might not be regarded as true AI, but whatever it is, most other retailers would be happy to have just 1 per cent of it.
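As an illustration of the general idea – not Amazon’s actual, unpublished method – here is a toy “customers who bought X also bought Y” recommender built on item co-occurrence counts:

```python
from collections import Counter
from itertools import combinations

# Toy recommender: rank items by how often they appear in the same
# basket as the item being viewed. A hypothetical sketch only;
# production systems use far more sophisticated models.
def build_cooccurrence(baskets):
    """Count how often each pair of items appears in the same basket."""
    pair_counts = Counter()
    for basket in baskets:
        for a, b in combinations(sorted(set(basket)), 2):
            pair_counts[(a, b)] += 1
    return pair_counts

def recommend(item, pair_counts, top_n=3):
    """Rank other items by co-occurrence with `item`."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if a == item:
            scores[b] += count
        elif b == item:
            scores[a] += count
    return [other for other, _ in scores.most_common(top_n)]

baskets = [
    ["kettle", "teapot", "mugs"],
    ["kettle", "mugs"],
    ["teapot", "tea"],
    ["kettle", "toaster"],
]
print(recommend("kettle", build_cooccurrence(baskets)))  # "mugs" ranked first
```

Even this crude counting approach surfaces plausible suggestions; the commercial value comes from doing it at the scale of hundreds of millions of baskets.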

Lately, Amazon has been applying its AI capabilities to its Amazon Web Services offering.

The company is using AI to transcribe telephone conversations in real time and analyse customer sentiment.

It is another essentially simple application for an algorithm – although computers are still far from perfect at transcribing human speech. But if anyone can make it work, it’s probably Amazon.
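As a rough illustration of the sentiment half of that pipeline – a keyword-counting toy, nothing like the trained models commercial services actually use – consider:

```python
# Toy sentiment scorer for call transcripts. A deliberately simple sketch:
# real systems use trained language models, not keyword lists.
POSITIVE = {"great", "thanks", "happy", "perfect", "love"}
NEGATIVE = {"problem", "broken", "refund", "angry", "cancel"}

def sentiment(transcript: str) -> str:
    """Classify a transcript as positive, negative or neutral by keyword counts."""
    words = [w.strip(".,!?") for w in transcript.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(sentiment("Thanks, that was great!"))                    # positive
print(sentiment("My order arrived broken, I want a refund."))  # negative
```

The hard part in practice is the transcription step that feeds such a classifier, which is exactly where the accuracy problems mentioned above bite.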


Salesforce

Salesforce last year launched an AI system called Einstein – named, of course, after the famous scientist Albert Einstein.

The world’s leading customer relationship management platform says Einstein utilises deep learning to help marketers gain insights into their prospects and clients.

Deep learning is a branch of machine learning which uses multi-layered neural networks to uncover the deeper, underlying patterns in large sets of data; it reads between the lines, as it were.

Machine learning more broadly covers systems that learn to perform tasks from examples, without necessarily modelling the underlying structure in such depth.

Salesforce has also acquired a startup called MetaMind, developer of an AI system which is said to be able to predict outcomes for language, vision and database tasks.

MetaMind’s system looks to have been incorporated into Salesforce’s Einstein.


Microsoft

Microsoft has taken the investment-led approach to AI further than most other companies.

The company established a venture capital fund specifically for AI and backed an incubator called Element AI.

Meanwhile, one of its relatively old acquisitions, Skype, has developed an AI system which can not only transcribe conversations in real time, but even translate them on the fly.

It’s not the first time these things have been tried or done, but achieving adequate levels of accuracy will be key to any of this paying off.


Facebook

Another of Microsoft’s investments, Facebook has been developing an AI system which can recognise an image and then describe it using voice. It’s said to be aimed at blind users of Facebook, and utilises neural networks.

Facebook is also developing a deep learning AI which can analyse the billions of interactions between Facebook users to try and figure out what matters to them.


Apple

Apple is probably less closely associated with AI because it’s often seen as a hardware company first and a software company second.

But to be fair to Apple, despite the massive price tags it places on its products, the company was the first among the tech giants to launch a voice assistant – Siri, which was actually an acquisition, not an in-house development.

Siri has now escaped the confines of the iPhone and iPad and found its way onto iMacs and MacBooks. And the company has launched a new initiative to develop more AI systems going forward.


Dyson

James Dyson, the founder of Dyson, is arguably the Sir Clive Sinclair of today. Both are inventors, but whereas Sinclair’s commercial success was limited to his ahead-of-their-time computers, Dyson seems incapable of making a wrong move with any product he launches.

Dyson vacuum cleaners are generally regarded as the best in their product category, and the company is developing robotic versions of its household appliances.

And while AI is critical to the proper functioning of autonomous robotic vacuum cleaners, Dyson is also looking to integrate AI across its entire enterprise – from research and design through to prototyping and production.