What is an artificial intelligence?
Although we often hear about them, artificial intelligences remain poorly understood by most people. Let us try to explain what they are and where they come from. The idea of artificial intelligence was first presented by a team of American scientists in 1956, during the Dartmouth Summer Research Project on Artificial Intelligence [wikipedia]. Two main points came out of this presentation:
- It was believed that computers would eventually come to think autonomously, as humans do, and that they would solve problems just as efficiently as humans, if not more so. This is in fact what was first referred to as artificial intelligence.
- The creation of a functional artificial intelligence would be relatively simple; according to the team, a dozen scientists could complete such a project over a single summer.
While the optimism of these scientists was less than realistic, we must still commend their work and recognize that, as a result of it, immeasurable advances have been made in the field.
To the skies
2012 was a landmark year for the development of artificial intelligences. In November of that year, a team of researchers from the University of Toronto, led by Professor Geoffrey Hinton, built an artificial intelligence that successfully identified some 85% of the images it was shown. By comparison, the average human can identify 95% of images in such tests, whereas no other artificial intelligence had ever passed the 72% mark.
This major breakthrough was credited to the refinement of learning algorithms known as deep learning. This approach works by assigning numbers to the different shapes and colors that make up individual images. The data is then broken down into manageable chunks, known as layers, analyzed, and statistically compared to layers of existing data. The statistical correlation between new and existing data is what allows the algorithm to identify images. Human intervention only occurs when the time comes to verify whether the images were successfully identified. Should that be the case, the resulting statistical correlations are added to the algorithm’s database for future use.
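The layered, numeric processing described above can be sketched with a toy feedforward network. Everything here is illustrative, not the Toronto team's actual model: the image is simulated with random pixel values, the layer sizes are arbitrary, and the weights are drawn at random rather than learned from labelled data as they would be in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Non-linearity applied between layers
    return np.maximum(0, x)

def softmax(x):
    # Turn final-layer scores into probabilities over the classes
    e = np.exp(x - x.max())
    return e / e.sum()

# An image reduced to numbers: here, 64 simulated pixel values
image = rng.random(64)

# Two illustrative layers; real weights would be learned from
# labelled training data, not drawn at random as they are here
w1 = rng.standard_normal((32, 64))
w2 = rng.standard_normal((10, 32))

hidden = relu(w1 @ image)      # first layer condenses the raw pixels
probs = softmax(w2 @ hidden)   # second layer scores 10 hypothetical classes

predicted_class = int(np.argmax(probs))  # the class the network "identifies"
```

A trained network differs from this sketch only in how the weights are obtained: they are adjusted over many verified examples until the statistical correlations reliably pick out the right class.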
In 2015, the Toronto team broke its own record with a successful image identification rate of 96%, thus fulfilling the rise of artificial intelligences prophesied some 60 years earlier. Although further refinements of learning algorithms were once again credited for these improvements, the exponential growth of computing power was also deemed a major factor in this victory. It is in fact what nowadays allows us to obtain precise results in real time, whereas such results once required weeks of work to obtain.
Artificial intelligence vs. business expectations
Forward-thinking businesses have been trying to find ways of integrating deep learning-like algorithms into their software, and some have succeeded. Unsurprisingly, major players of the computer industry such as Microsoft and Google are staying ahead of the curve; several of their most important products already exploit artificial intelligences in one way or another. Cortana is the name of Microsoft’s latest creation, a “digital agent” included in the most recent versions of the company’s well-known operating system. This agent can, amongst other things, hold a live conversation, help with file management, or perform web searches, all while evolving and adapting to its individual users. For its part, Google has been using deep learning algorithms to tailor its search engine results to user searching trends.
It is worth noting that although many are affected by artificial intelligences on a daily basis, few are actually aware of their presence. Those who feel uneasy about this will have to get used to the idea, too: businesses and agencies in many different sectors – ranging from finance to health to manufacturing – have already shown interest in exploiting artificial intelligences. Large companies have been buying out smaller companies specialized in the development of artificial intelligence to better integrate the technology into their services and wares. Artificial intelligences are seen as particularly valuable for their ability to reduce employee costs and human mistakes. The Markets and Markets consulting company predicts that the value of the artificial intelligence sector will rise to some 5 billion American dollars within four years, whereas Bank of America Merrill Lynch suggests that artificial intelligences could help businesses save between 14 and 33 trillion American dollars worldwide by 2025, mostly with regard to employee costs. Such savings are in fact the main reason 62% of businesses adopt some form of artificial intelligence-based technology.
Despite major progress in the field, artificial intelligences are still subject to certain limitations. Although they may be very efficient – much more so than humans – at repetitive and numbers-based tasks, they display certain weaknesses when faced with cognitive tasks, such as speech comprehension. The efficacy of artificial intelligences is also highly dependent upon the quantity, quality, and organization of the data they are fed. Furthermore, computers that are geared to handle such technologies are particularly energy-intensive, whereas the human brain is not. That said, most of these limitations are only minor, temporary setbacks; in 2014 alone, an estimated 13.8 million dollars were invested in the adoption of artificial intelligences, not counting the billions invested yearly by venture capital firms.
What about your business?
It is clear that the use of artificial intelligences will reduce your operating expenses. They are also quite useful when it comes to decision-making, as they can easily recognize complex trends for you. It is no longer a question of whether businesses should embrace such technology, but rather of when they will take the plunge. Are you ready? The three following steps should help you get there.
First, you must begin by cultivating a genuine appreciation for artificial intelligences. The Narrative Science research firm recently revealed the results of a survey: whereas only 38% of respondents thought they were using products based on artificial intelligences, 88% of them actually were. How can this 50-point disparity be explained? Simple, really: a lack of knowledge about artificial intelligences.
Second, you need to figure out exactly how artificial intelligences might help with your daily business or management activities. Are you trying to improve your production level, cut down on staff, or improve your algorithms? The possibilities are nearly endless, so it is essential that you understand your needs.
Third, you should see to the proper management of your data, especially once you have adopted an artificial intelligence. As stated earlier, such systems become particularly efficient when data are well sorted and organized. This sorting could – and probably should – even be done before integrating an artificial intelligence, so as to allow you to quickly draw interesting conclusions from trends that affect your business and clients. Although artificial intelligence technologies are not accessible to all, it should be mentioned that there are alternatives.
Cimpl, our IT and telecom expense management platform, allows you to oversee and sort the entirety of your assets. It was specifically designed to save you time and reduce your IT business costs through actionable data analytics. Putting such software to use may very well pave the way to the adoption of an artificial intelligence by improving your IT cost management and reducing your data and IT expenses. Our website has all the information you need to get started with our Cimpl IT asset management platform and services. Come and visit!