How does Amazon use machine learning?

Artificial intelligence (AI) is the area of computer science concerned with giving machines cognitive skills usually attributed to human intelligence, such as learning, problem solving and pattern recognition. AI is often associated with robots or futuristic scenes, but it goes far beyond the automatons of science fiction and is a practical part of modern computer science. Professor Pedro Domingos, a well-known researcher in this field, describes "five tribes" of machine learning: symbolism rooted in logic and philosophy, connectionism derived from neuroscience, evolutionism close to evolutionary biology, Bayesianism based on statistics and probability, and analogism based on psychology. New, more efficient methods of statistical computation have recently brought the Bayesians advances in several areas, which are collectively referred to as "machine learning". Similarly, advances in network computing have pushed connectionism forward in a sub-domain called "deep learning". Machine learning (ML) and deep learning (DL) are both areas of computer science that belong to the discipline of artificial intelligence.

In general, these techniques fall into "supervised" and "unsupervised" approaches: supervised techniques train on data that includes the desired output, while unsupervised techniques work with data that lacks it.
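The split can be sketched with toy data: a supervised learner sees labels and learns a mapping, while an unsupervised one only groups raw values. The following is a minimal Python sketch; all data, labels and function names are invented for illustration.

```python
# Supervised: labeled training pairs (feature, label) are available.
labeled = [(1.0, "small"), (1.5, "small"), (8.0, "large"), (9.0, "large")]

def train_centroid_classifier(data):
    """Learn one centroid per label from labeled examples."""
    sums, counts = {}, {}
    for x, y in data:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(centroids, x):
    """Assign the label whose learned centroid is nearest to x."""
    return min(centroids, key=lambda y: abs(centroids[y] - x))

centroids = train_centroid_classifier(labeled)
print(predict(centroids, 7.2))  # → large

# Unsupervised: only raw values, no desired output; group them into 2 clusters.
def two_means(values, iters=10):
    a, b = min(values), max(values)          # initial cluster centers
    for _ in range(iters):
        ca = [v for v in values if abs(v - a) <= abs(v - b)]
        cb = [v for v in values if abs(v - a) > abs(v - b)]
        a, b = sum(ca) / len(ca), sum(cb) / len(cb)
    return a, b

print(two_means([1.0, 1.5, 8.0, 9.0]))  # two discovered group centers
```

The supervised model needed the "small"/"large" labels to learn; the unsupervised one discovered the two groups on its own, but cannot name them.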

AI becomes more capable and learns faster the more data is available. The fuel that feeds machine and deep learning solutions is generated by companies every day, whether the data is extracted from data warehouses such as Amazon Redshift, validated through the "wisdom of the crowd" with Mechanical Turk, or siphoned off dynamically from Kinesis Streams. With the rise of the Internet of Things (IoT), sensor technology also contributes exponentially to the amount of data to be analyzed - data that now streams from previously virtually untouched sources, locations, objects and events.

“Machine learning” generally refers to a family of Bayesian pattern recognition and pattern learning techniques. At its core, machine learning is a collection of algorithms that learn from recorded data to make predictions, optimize a utility function under uncertainty, extract hidden structures from data, and classify data into concise descriptions. Machine learning is often used where explicit programming is too rigid or impractical. In contrast to conventional program code, which software developers write to produce specific output for a given input, machine learning uses data to generate statistical code (an ML model) that returns the "correct result" based on patterns recognized in previous input examples (and, in the case of supervised techniques, in the corresponding outputs). The accuracy of an ML model depends primarily on the quality and quantity of the historical data.
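The contrast can be made concrete with a toy example: instead of a developer hand-coding the rule, the rule is fitted from recorded input/output pairs. Here a one-dimensional least-squares fit stands in for an "ML model"; the data is invented.

```python
def fit_line(examples):
    """Learn slope and intercept from (input, output) pairs via least squares."""
    n = len(examples)
    mx = sum(x for x, _ in examples) / n
    my = sum(y for _, y in examples) / n
    slope = (sum((x - mx) * (y - my) for x, y in examples)
             / sum((x - mx) ** 2 for x, _ in examples))
    return slope, my - slope * mx

# Historical examples: the "correct result" for each previous input.
history = [(1, 3.0), (2, 5.0), (3, 7.0), (4, 9.0)]
slope, intercept = fit_line(history)

# The learned model now answers for unseen input, where a conventional
# program would need the rule y = 2x + 1 written in by hand.
print(slope * 5 + intercept)  # → 11.0
```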

With the right data, an ML model can analyze high-dimensional problems with billions of examples and find the optimal function that accurately predicts the output for a given input. As a rule, ML models can also provide statistical confidence scores for individual forecasts as well as for their overall performance. Such scores are important when deciding whether to trust an ML model or a single forecast.
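One simple way a model can report confidence alongside a prediction is a k-nearest-neighbour vote, where the vote fraction serves as a rough confidence score. This is only an illustrative sketch with invented data, not a statement about how any particular AWS service computes confidence.

```python
from collections import Counter

# Toy training data: (monthly activity score, observed outcome).
train = [(1.0, "churn"), (1.2, "churn"), (3.0, "stay"), (3.3, "stay"), (2.6, "stay")]

def predict_with_confidence(x, k=3):
    """Return the majority label of the k nearest examples and its vote share."""
    neighbours = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    votes = Counter(label for _, label in neighbours)
    label, count = votes.most_common(1)[0]
    return label, count / k   # prediction plus its share of the vote

label, conf = predict_with_confidence(2.8)
print(label, conf)  # → stay 1.0
```

A downstream system could act only on forecasts whose confidence exceeds a chosen threshold.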

Amazon.com bases a large part of its business on ML-based systems. Without ML, Amazon.com could not grow its business, improve its customers' experience and selection, or optimize its logistical speed and quality. With AWS, Amazon.com wants other companies to benefit from the same IT infrastructure - with its agility and cost advantages - and it is now continuing that process by democratizing ML technologies and putting them in the hands of every business.

The structure of Amazon.com's development teams, together with their focus on using ML to solve tough, pragmatic business problems, has been a catalyst for Amazon.com and AWS to create easy-to-use yet powerful ML tools and services. Like other IT services, these tools were first tested against Amazon.com's own big data in a mission-critical environment before being released to customers as AWS services.

Machine learning is often used to predict future outcomes based on historical data. Organizations use machine learning, for example, to forecast the sales of their products in coming fiscal quarters for a specific demographic group, or to estimate which customer profiles are most likely to become dissatisfied with their services or, conversely, to remain most loyal to their brand. Such forecasts enable better business decisions and more personal customer experiences, and they have the potential to reduce customer retention costs. Complementary to business intelligence (BI), which deals with past business data, ML predicts future results based on past trends and transactions.
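As a minimal sketch of forecasting from historical data, the next quarter can be projected from the average quarter-over-quarter growth rate of past sales. All figures here are invented, and real forecasting models are far more sophisticated.

```python
past_quarters = [100.0, 110.0, 121.0, 133.1]   # historical sales per quarter

def forecast_next(sales):
    """Project the next value using the mean historical growth rate."""
    growth = [b / a for a, b in zip(sales, sales[1:])]
    avg_growth = sum(growth) / len(growth)
    return sales[-1] * avg_growth

print(round(forecast_next(past_quarters), 1))  # → 146.4
```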

Various components contribute to the successful implementation of an ML solution in a company. First, the problem must be identified correctly - that is, the forecast from which the business would derive the most benefit if it proved accurate. Then data must be collected from historical business records (transactions, sales, customer churn, and so on); only on this basis can an ML model be developed. The model is then run, and its forecast output is fed back into the business system so that more informed decisions can be made.
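The steps above can be sketched as one small end-to-end pipeline: collect historical data, train a model, score new cases, and feed the predictions back into a business decision. The data, the threshold "model" and all names are invented placeholders.

```python
# 1. Historical business records: (monthly_spend, churned?)
historical = [(10, True), (15, True), (80, False), (95, False), (12, True)]

# 2. "Train": learn a spend threshold separating churners from loyal customers.
def train_threshold(records):
    churn_avg = sum(s for s, c in records if c) / sum(1 for _, c in records if c)
    stay_avg = sum(s for s, c in records if not c) / sum(1 for _, c in records if not c)
    return (churn_avg + stay_avg) / 2

# 3. Score current customers and 4. feed the forecast back into a decision.
def retention_actions(model_threshold, customers):
    return {name: "send_offer" if spend < model_threshold else "no_action"
            for name, spend in customers.items()}

threshold = train_threshold(historical)
print(retention_actions(threshold, {"alice": 20, "bob": 90}))
```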

Implementation of machine learning in your company

Detection of products, events or observations that do not match an expected pattern or other elements of a data set.

Development of forecast models that help identify potentially fraudulent sales transactions or damaging product reviews.

Identification of customers who are at high risk of churn, which requires proactive measures to strengthen customer loyalty, for example in the form of special offers or customer service follow-ups.

Providing an optimally personalized customer experience with the help of predictive analysis models that suggest products or optimize the website output based on previous customer actions.
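The first use case above - flagging observations that do not match the expected pattern - can be illustrated with a simple z-score rule over toy sensor readings (the data and cutoff are invented for demonstration).

```python
import statistics

def anomalies(values, z_cutoff=2.0):
    """Return values lying more than z_cutoff standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    return [v for v in values if abs(v - mean) / stdev > z_cutoff]

readings = [10.1, 9.8, 10.3, 10.0, 9.9, 42.0]   # one clearly odd reading
print(anomalies(readings))  # → [42.0]
```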

Deep learning is a sub-area of machine learning in which algorithms are stacked in intermediate layers so that data can be analyzed and understood even better. These algorithms do not just produce an explainable set of relationships, as a more basic kind of regression would. Instead, deep learning uses these layers of non-linear algorithms to create distributed representations that interact with one another on the basis of many factors. Given large amounts of training data, deep learning algorithms begin to recognize relationships between the individual elements - relationships between shapes, colors, words or other properties - and the system can make forecasts based on them. Within machine learning and artificial intelligence, the strength of deep learning lies in the system's ability to recognize far more relationships than humans could code in software, often because those relationships are imperceptible to the human brain. After sufficient training, this network of algorithms can interpret extremely complex data and make forecasts from it.
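The idea of layered non-linear algorithms can be shown with a tiny fixed-weight two-layer network whose hidden layer builds an intermediate representation. The weights below are hand-picked (not learned) so that the network computes XOR, a relationship no single linear layer can express; everything here is purely illustrative.

```python
import math

def layer(inputs, weights, biases):
    """One dense layer with a tanh non-linearity."""
    return [math.tanh(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def tiny_net(x1, x2):
    hidden = layer([x1, x2], [[4, 4], [-4, -4]], [-2, 6])   # intermediate layer
    out = layer(hidden, [[4, 4]], [-4])                     # output layer
    return out[0] > 0                                       # binary decision

for a in (0, 1):
    for b in (0, 1):
        print(a, b, tiny_net(a, b))   # reproduces XOR
```

In real deep learning, such weights are not hand-picked but adjusted automatically from training data, across many more layers and units.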

For many vision tasks, including object classification, convolutional neural networks now rival or even exceed human performance. Once a deep learning system has had the opportunity to train on millions of labeled images, it eventually begins to recognize the subject of an image on its own. It is therefore not surprising that many photo storage services use facial recognition driven by deep learning. This capability is also a core element of Amazon Rekognition, Amazon Prime Photos and Amazon's Firefly service.

Amazon Alexa and other virtual assistants are designed to recognize a request and answer it correctly. While humans learn to understand language from an early age, understanding and answering human speech is still a relatively young field for computers. The varied accents and speech patterns of human language are particularly difficult for a machine to understand, requiring far more mathematics and computing power than conventional computing tasks. Thanks to deep learning, such systems are increasingly able to recognize speech and interpret its message.

The field of natural language processing seeks to teach systems to understand human language, taking tone of voice and context into account. Such systems are beginning to interpret complex concepts such as emotion and sarcasm more and more accurately. This area of computer science is growing as companies increasingly try to automate their customer service with speech recognition and chatbots, as already used in Amazon Lex.
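A very small piece of natural language processing can be sketched by scoring the sentiment of a review against a hand-built word lexicon. The lexicon and reviews below are invented; real systems like those behind Amazon Lex learn such associations from data rather than from a fixed word list.

```python
LEXICON = {"great": 1, "love": 1, "excellent": 1,
           "terrible": -1, "broken": -1, "disappointing": -1}

def sentiment(text):
    """Classify text by summing per-word sentiment weights."""
    words = text.lower().replace(".", "").replace(",", "").split()
    score = sum(LEXICON.get(w, 0) for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("I love this product, it is excellent."))  # → positive
print(sentiment("Terrible quality and a broken lid."))     # → negative
```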

Online shops often offer personalized recommendations for products, films or news that might interest the user. In the infancy of online shopping, these systems were fed by people of flesh and blood who linked items together by hand. With the advent of big data and deep learning, people are no longer required for these tasks. Today, algorithms identify the products that might interest you by analyzing your past purchases and product searches and comparing this information with the purchases of other users.
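The co-purchase idea behind "customers who bought X also bought Y" can be sketched by counting how often items appear together in order histories and recommending an item's most frequent companions. The orders and item names are invented, and real recommender systems use far richer signals.

```python
from collections import Counter
from itertools import combinations

orders = [
    {"camera", "tripod", "sd_card"},
    {"camera", "sd_card"},
    {"camera", "tripod"},
    {"novel", "bookmark"},
]

# Count every unordered item pair across all orders.
pair_counts = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        pair_counts[(a, b)] += 1

def recommend(item, n=2):
    """Items most often bought together with `item`."""
    companions = Counter()
    for (a, b), c in pair_counts.items():
        if a == item:
            companions[b] += c
        elif b == item:
            companions[a] += c
    return [other for other, _ in companions.most_common(n)]

print(sorted(recommend("camera")))  # → ['sd_card', 'tripod']
```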