Top 10 technology trends in data processing and analytics in 2019 according to Gartner
Good evening. A translation of the following article has been prepared especially for students of the BI Analyst course. Enjoy the read.
The focus of the Gartner Data & Analytics Summit on February 18-19 in Sydney was Augmented Analytics and Artificial Intelligence.
Augmented analytics, continuous intelligence, and explainable artificial intelligence (explainable AI) are among the top technology trends in data and analytics with disruptive potential over the next three to five years, according to Gartner, Inc.
Speaking at the Gartner Data & Analytics Summit in Sydney, Rita Sallam, Gartner's research vice president, noted that data and analytics leaders should examine the potential business impact of these trends and adjust their business models and operations accordingly, or risk losing competitive advantage to those who do.
"The story of data and analytics keeps evolving, from supporting internal decision making to continuous intelligence, information products, and hiring data professionals," said Rita Sallam. "It is essential to gain a deeper understanding of the technology trends driving that evolution and to prioritize them based on their value to a particular business."
According to Donald Feinberg, vice president and distinguished analyst at Gartner, the main challenge posed by digital disruption (too much data) has also created an unprecedented opportunity. The vast amount of data, together with the ever-growing processing power made available by cloud technologies, means it is now possible to train and run algorithms at the scale necessary to fully realize the potential of AI.
"The size, complexity and distributed nature of data, together with the speed of action and the continuous intelligence that digital business requires, mean that rigid, centralized architectures and tools can no longer cope," says Feinberg. "The continued survival of any business will depend on an agile, data-driven architecture that keeps pace with constant change."
Gartner recommends that data and analytics leaders discuss the company's top priorities with business stakeholders and consider how the following trends can be integrated into their work.
Trend No. 1. Augmented analytics
Augmented analytics is the next wave of disruption in the data and analytics market. It uses machine learning and artificial intelligence techniques to transform how analytics content is developed, consumed, and shared.
By 2020, augmented analytics will be a dominant driver of new purchases of analytics and BI, data science and ML platforms, and embedded analytics. Data and analytics leaders should plan to adopt augmented analytics as the platforms mature.
Trend No. 2. Augmented data management
Augmented data management applies ML capabilities and AI engines to enterprise information management categories, including data quality, metadata management, master data management, data integration, and database management systems (DBMS), making them self-configuring and self-tuning. It automates many manual tasks and allows less technical users to work with data themselves, so that highly skilled specialists can focus on higher-value tasks.
Augmented data management transforms metadata from something used only for audit, lineage and reporting into something that powers dynamic systems. Metadata shifts from passive to active and becomes a primary driver for all AI/ML.
Through the end of 2022, manual data management tasks will be reduced by 45% thanks to machine learning and automated service-level management.
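To make the idea concrete, here is a minimal, hypothetical sketch of the kind of task augmented data management automates: profiling a column and flagging quality issues against a simple rule. The function names and thresholds are illustrative, not any vendor's API.

```python
from collections import Counter

def profile_column(values):
    """Profile one column: null rate, distinct count, inferred type.

    A toy version of the automated profiling that augmented data
    management tools run continuously to keep metadata active.
    """
    non_null = [v for v in values if v is not None]
    null_rate = 1 - len(non_null) / len(values) if values else 0.0
    types = Counter(type(v).__name__ for v in non_null)
    return {
        "null_rate": round(null_rate, 2),
        "distinct": len(set(non_null)),
        "inferred_type": types.most_common(1)[0][0] if types else "unknown",
    }

def quality_issues(column_name, values, max_null_rate=0.1):
    """Flag columns whose profile violates a simple quality rule."""
    stats = profile_column(values)
    issues = []
    if stats["null_rate"] > max_null_rate:
        issues.append(f"{column_name}: null rate {stats['null_rate']} "
                      f"exceeds {max_null_rate}")
    if stats["distinct"] == 1 and values:
        issues.append(f"{column_name}: constant column")
    return issues
```

Running `quality_issues("email", ["a@x.com", None, None, "b@x.com"])` flags the column because half the values are missing. A real system learns such thresholds from usage instead of hard-coding them.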
Trend No. 3. Continuous intelligence
By 2022, more than half of major new business systems will incorporate continuous intelligence, which uses real-time context data to improve decisions.
Continuous intelligence is a design pattern in which real-time analytics is integrated into business operations, processing current and historical data to prescribe actions in response to events. It provides decision automation or decision support. Continuous intelligence draws on multiple technologies, such as augmented analytics, event stream processing, optimization, business rule management, and machine learning.
"Continuous intelligence represents a major change in the job of the data and analytics team," says Sallam. "It is both a grand challenge and a grand opportunity for analytics and BI teams to help businesses make smarter real-time decisions as early as 2019. It could be seen as the ultimate form of operational BI."
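The design pattern can be sketched in a few lines. This is a toy illustration, not a production stream processor: each incoming event is scored against a rolling historical baseline (the context), and an action is prescribed immediately. The class, field names and threshold are invented; real deployments sit on a stream-processing platform rather than a Python loop.

```python
from collections import deque
from statistics import mean

class ContinuousScorer:
    """Toy continuous-intelligence loop: score each incoming event
    against a rolling baseline of recent history and prescribe an
    action in response."""

    def __init__(self, window=50, spike_factor=2.0):
        self.history = deque(maxlen=window)   # recent history = context
        self.spike_factor = spike_factor      # simple business rule

    def handle(self, event):
        value = event["value"]
        # Baseline from historical data; fall back to the value itself
        # on the very first event.
        baseline = mean(self.history) if self.history else value
        self.history.append(value)
        if value > self.spike_factor * baseline:
            return {"event": event["id"], "action": "alert"}
        return {"event": event["id"], "action": "ok"}

scorer = ContinuousScorer(window=3)
stream = [{"id": i, "value": v} for i, v in enumerate([10, 11, 9, 40])]
decisions = [scorer.handle(e) for e in stream]
# The final spike (40, against a baseline of ~10) triggers an alert.
```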
Trend No. 4. Explainable AI
AI models are increasingly deployed to augment or entirely replace human decision making. However, in some scenarios, businesses must justify how these models arrive at specific decisions. To build trust with users and stakeholders, application architects must make these models more interpretable and explainable.
Unfortunately, most advanced AI models are complex black boxes that cannot explain why they reached a specific recommendation or decision. Explainable AI in data science and ML platforms, for example, automatically generates an explanation of models in natural language, in terms of accuracy, attributes, model statistics and features.
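As a concrete illustration of one such model-agnostic explanation technique, here is a minimal permutation-importance sketch: shuffle one feature and measure how much accuracy drops. The larger the drop, the more the model relies on that feature. The toy model and data are invented for the example.

```python
import random

def accuracy(model, X, y):
    return sum(model(x) == yi for x, yi in zip(X, y)) / len(y)

def permutation_importance(model, X, y, feature_idx, seed=0):
    """Accuracy drop when one feature column is shuffled: a simple,
    model-agnostic explanation an explainable-AI layer might
    generate automatically for a black-box model."""
    rng = random.Random(seed)
    base = accuracy(model, X, y)
    col = [x[feature_idx] for x in X]
    rng.shuffle(col)
    X_perm = [list(x) for x in X]
    for row, v in zip(X_perm, col):
        row[feature_idx] = v
    return base - accuracy(model, X_perm, y)

# Toy "black box": approve (1) if income (feature 0) >= 50;
# it silently ignores age (feature 1).
model = lambda x: 1 if x[0] >= 50 else 0
X = [[60, 25], [30, 40], [55, 33], [20, 61]]
y = [1, 0, 1, 0]
# Shuffling the ignored age feature never changes predictions,
# so its importance comes out as exactly zero.
```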
Trend No. 5. Graph analytics
Graph analytics is a set of analytical methods that allow you to explore the relationships between objects of interest, such as organizations, people, and transactions.
The application of graph processing and graph DBMSs will grow 100 percent annually through 2022, continuously accelerating data preparation and enabling more complex and adaptive data science.
Graph data stores can efficiently model, explore and query data with complex interrelationships across data silos, but the need for specialized skills to work with them remains their main limitation today.
Graph analytics will grow steadily over the next few years, driven by the need to ask complex questions of complex data, which is not always practical, or even possible, at scale using SQL queries.
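A small example of the kind of question graph analytics answers naturally: what is the shortest chain of relationships connecting two entities? The entities below are invented, and a real graph DBMS would run this as a declarative path query; the hand-written breadth-first search just shows the underlying idea.

```python
from collections import deque

def shortest_link(edges, start, goal):
    """Breadth-first search for the shortest chain of relationships
    between two entities -- easy on a graph, painful as SQL joins."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no connection at all

# Illustrative entities: people, a company, and a transaction.
edges = [("alice", "acme"), ("bob", "acme"),
         ("bob", "txn-17"), ("carol", "txn-17")]
# shortest_link(edges, "alice", "carol")
# -> ['alice', 'acme', 'bob', 'txn-17', 'carol']
```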
Trend No. 6. Data fabric
Data fabric enables frictionless access to and sharing of data in a distributed data environment. It is a single, consistent data management framework that allows seamless data access and processing across otherwise siloed storage.
Through 2022, bespoke data fabric designs will be deployed primarily as static infrastructure, forcing organizations into a new wave of cost to completely redesign for more dynamic data mesh approaches.
Trend No. 7. NLP / Conversational Analytics
By 2020, 50 percent of analytical queries will be generated via search, natural language processing (NLP) or voice, or will be generated automatically. The need to analyze complex combinations of data, and to make analytics accessible to everyone in the organization, will drive broader adoption, allowing analytics tools to be as easy to use as a search interface or a conversation with a virtual assistant.
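A toy sketch of the mapping such a conversational interface performs: turning a natural-language question into a structured query specification. Real products use full NLP models; the metric and dimension vocabularies here are invented purely to show the shape of the translation.

```python
import re

# Hypothetical vocabularies a conversational-analytics layer
# might maintain for one dataset.
METRICS = {"sales": "SUM(amount)", "orders": "COUNT(*)"}
DIMENSIONS = {"region", "month", "product"}

def parse_question(question):
    """Very small rule-based parser from a natural-language question
    to a structured query spec (metric + group-by dimensions)."""
    words = re.findall(r"[a-z]+", question.lower())
    metric = next((METRICS[w] for w in words if w in METRICS), None)
    group_by = [w for w in words if w in DIMENSIONS]
    return {"select": metric, "group_by": group_by}

# parse_question("Show sales by region")
# -> {"select": "SUM(amount)", "group_by": ["region"]}
```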
Trend No. 8. Commercial AI and ML
Gartner predicts that by 2022, 75% of new end-user solutions that use AI and ML methods will be built on commercial solutions, rather than on open source platforms.
Commercial vendors have built connectors into the open-source ecosystem and provide the enterprise features needed to scale and democratize AI and ML, such as project and model management, reuse, transparency, data lineage, and the cohesion and integration with other platforms that open-source platforms lack.
Trend No. 9. Blockchain
The core value of blockchain and distributed ledger technologies is providing decentralized trust across a network of untrusted participants. There is significant potential for analytics use cases, especially those that examine the relationships and interactions among participants.
However, it will be several years before four or five major blockchain technologies become dominant. Until then, technology end users will be forced to fit in with the blockchain technologies and standards dictated by dominant customers or networks. This includes integration with existing data and analytics infrastructure, and the costs of that integration may outweigh any potential benefit. A blockchain is a data source, not a database, and does not replace existing data management technologies.
Trend No. 10. Persistent memory servers
New persistent-memory technologies will help reduce the cost and complexity of adopting architectures enabled for in-memory computing (IMC). Persistent memory represents a new memory tier between DRAM and NAND flash memory that can provide cost-effective mass memory for high-performance workloads. It has the potential to improve application performance, availability, boot times, clustering methods and security practices, while keeping costs under control. It will also help organizations reduce the complexity of their application and data architectures by decreasing the need for data duplication.
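To illustrate the programming model (only as a rough analogy: real persistent memory is accessed through dedicated libraries such as PMDK, with no filesystem in the hot path), here is a Python sketch using a file-backed mmap. The point it shows is the one the trend describes: data is updated in place, byte-addressably, with no serialize/deserialize step, and survives restarts.

```python
import mmap
import os
import struct
import tempfile

# A persistent 8-byte counter, updated in place. The file path and
# layout are invented for this demo.
path = os.path.join(tempfile.gettempdir(), "pmem_demo.bin")
if not os.path.exists(path):
    with open(path, "wb") as f:
        f.write(struct.pack("<Q", 0))  # little-endian uint64, starts at 0

with open(path, "r+b") as f:
    mm = mmap.mmap(f.fileno(), 8)
    (count,) = struct.unpack("<Q", mm[:8])
    mm[:8] = struct.pack("<Q", count + 1)  # in-place, byte-level update
    mm.flush()  # analogous to a persistence barrier on real PMEM
    mm.close()
# Re-running this block keeps incrementing the same counter: the data
# outlives the process without any load/store of a serialized copy.
```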
"The volume of data is growing rapidly, and the urgency of transforming data into value in real time is growing at an equally rapid pace," said Feinberg. "New server workloads demand not just faster CPU performance, but also massive memory and faster storage."
For more information on using data and analytics to gain competitive advantage, see the Gartner Data & Analytics Insight Hub .
Gartner Data & Analytics Summit
The Gartner Data & Analytics Summits in 2019 will be held March 4-6 in London, March 18-21 in Orlando, May 29-30 in Sao Paulo, June 10-11 in Dubai, September 11-12 in Mexico City, and October 19-20 in Frankfurt. Follow the news and updates on Twitter with the hashtag #GartnerDA.
About Gartner
Gartner, Inc. is a leading global research and advisory company and a member of the S&P 500. We provide business leaders with the data, advice and tools they need to achieve their goals today and build successful organizations tomorrow.
Our unrivaled combination of expert-led, data-driven research helps clients make the right decisions on the issues that matter most. We are a trusted advisor and an objective resource for more than 15,000 organizations in more than 100 countries, across all major functions, in every industry and for companies of every size.
To learn more about how we help decision makers build the future of their business, visit gartner.com .
That's all. Leave your comments, and see you on the course!