Thought leaders weigh in on what we can expect in the IT space throughout the next year.

Otakar Nieder, senior director at Bohemia Interactive Simulations
The AI revolution, in the form of machine learning and deep learning, continues to gain traction in the software development industry. With graphics processing unit (GPU) acceleration, previous time and computing restrictions fall away, and new easy-to-use frameworks and data centers will make these technologies available to everyone in 2019.

Stephan Ewen, co-founder and CTO of data Artisans
5G, and the proliferation of sensors and IoT devices, will create more real-time streaming data, and more use cases that need instant reaction to events. Stream processing will be used as an efficient way to realize “edge computing.” Stream processing is a great match both for pre-processing data on devices or gateways, and for running event-driven logic on the edge.
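As a rough illustration of the pre-processing Ewen describes, the sketch below (plain Python, not a specific stream-processing framework such as Apache Flink) groups raw sensor readings into fixed-size tumbling windows and forwards only one aggregate per window; the window size and reading format are assumptions chosen for illustration.

```python
from typing import Iterable, Iterator, List

def tumbling_window_avg(readings: Iterable[float], size: int) -> Iterator[float]:
    """Group a stream of sensor readings into fixed-size (tumbling)
    windows and emit one average per window, shrinking the volume of
    data an edge device has to send upstream."""
    window: List[float] = []
    for value in readings:
        window.append(value)
        if len(window) == size:
            yield sum(window) / size
            window = []

# Example: six raw temperature readings become two window averages.
averages = list(tumbling_window_avg([20.0, 21.0, 22.0, 30.0, 31.0, 32.0], size=3))
# averages == [21.0, 31.0]
```

The same event-driven pattern scales down to a gateway or sensor hub, which is what makes stream processing a natural fit for edge computing.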

Gadi Oren, VP of products at LogicMonitor
AIOps and the “right type” of monitoring: Within the next three years, companies will figure out the right mix of signals and feedback for machine learning and will create a breakthrough in monitoring strategies. The first to leverage this new strategy after gathering the right data will have a key advantage in the market. Ultimately, these tools will increase team efficiency by allowing work that once required experts to be handled by generalists, delivering considerable customer value.

Richard Whitehead, chief evangelist at Moogsoft
Reducing the final 1% of IT operational noise: When we apply automation technologies to IT operations today, we’re able to see up to a 99 percent reduction in noise. Over the next decade, experts and innovators will keep pushing to eliminate that last 1 percent. While that might sound at first like a minimal accomplishment, the exponential increase in data and information makes it a monumental task.
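One common noise-reduction tactic behind numbers like these is deduplication: collapsing repeated alerts into a single event with a repeat count. The sketch below is a minimal illustration of that idea in plain Python; the field names are invented for the example, not any vendor’s actual alert schema.

```python
from collections import OrderedDict

def deduplicate(alerts):
    """Collapse alerts sharing the same (host, check) pair into one
    event carrying a repeat count -- one simple way automation cuts
    operational noise before a human ever sees it."""
    merged = OrderedDict()
    for alert in alerts:
        key = (alert["host"], alert["check"])
        if key in merged:
            merged[key]["count"] += 1
        else:
            merged[key] = {**alert, "count": 1}
    return list(merged.values())

raw = [
    {"host": "web-1", "check": "cpu"},
    {"host": "web-1", "check": "cpu"},
    {"host": "db-1", "check": "disk"},
]
events = deduplicate(raw)  # three alerts collapse into two events
```

The stubborn final 1 percent is exactly the noise that simple rules like this cannot catch, which is why it takes the harder, longer-term work Whitehead describes.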

Eric Han, VP of product management at Portworx
Containers find a new use case: machine learning. Containers and machine learning are two of the hottest trends in tech, but they aren’t often talked about together. In 2019, that will change as businesses start to recognize the benefits of running machine learning workloads in a containerized environment. Combining the technologies pays off in two ways: the ability to cluster and schedule container workloads lets containerized machine learning applications scale easily, and machine learning processes running inside containers can be exposed as microservices, making them easy to reuse across applications. As enterprises look to cognitive technologies to transform their businesses, they will see the benefits of bringing machine learning workloads to highly flexible and scalable cloud-native environments.
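To make the “model as a microservice” idea concrete, here is a minimal sketch using only the Python standard library: a stand-in scoring function wrapped in an HTTP handler that a container image could run. The model, route, and port are invented for illustration; a real deployment would package a trained model behind the same kind of small HTTP surface.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    """Toy 'model': a fixed linear scorer standing in for a trained one."""
    weights = [0.4, 0.6]
    return sum(w * x for w, x in zip(weights, features))

class PredictHandler(BaseHTTPRequestHandler):
    """Expose the model as a tiny HTTP microservice: POST a JSON body
    like {"features": [1.0, 2.0]} and receive {"score": ...} back."""
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        features = json.loads(body)["features"]
        response = json.dumps({"score": predict(features)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(response)

# To serve inside a container, a process would run:
# HTTPServer(("", 8080), PredictHandler).serve_forever()
```

Because the service is stateless and self-contained, an orchestrator can schedule and scale many replicas of it, which is the benefit Han describes.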

Dan Sommer, Qlik senior director and global market intelligence lead
Kubernetes comes of age, microservices become the norm

One of the biggest unsung megatrends of today is microservices and Kubernetes. In the span of a year, they have gone from emerging technologies to a hygiene factor: app dev teams at leading enterprises are now orchestrating container-based applications and demanding production Kubernetes environments.

John Morello, CTO of Twistlock
Cloud sprawl is real and getting worse as more and more cloud services are introduced.  Just as with server sprawl and VM sprawl before it, the challenge with cloud sprawl is governance and knowing what you actually have running as you can’t secure what you don’t know about.  Cloud providers make it so easy and seamless to create new services that it’s easy to experiment, move on, and then forget that you’ve deployed a database or app exposed to the world. Organizations should stress operational discipline like using automation for all deployments, so there’s clear boundaries, a defined process, and a basic record of the services they’re using.