Can real-time data visualisation deliver trust and opportunity? – ComputerWeekly.com

It is typical at this time of year to get a round of technology and business predictions from analysts and consultancies about what the future holds. Expect increased artificial intelligence (AI) and automation. More decarbonisation of the IT estate. Industry-specific clouds and more money spent on resilience and agility. Who doesn’t want a slice of that?
But hidden somewhere among the festive joy and doom and gloom is an idea that “trust”, an old Norse word apparently, is key to riding the waves of a volatile economy.
According to Sharyn Leaver, chief research officer at Forrester, “building trust among customers, employees and partners is a critical component to long-term growth”. When individuals have greater trust in companies, she says, “they are more deeply engaged and more willing to forgive a company’s mistakes”.
Leaver adds: “In making smart investments, focusing on their organisation’s mission and strengths, and establishing long-term trust with customers and employees, leaders will fuel their firms’ resiliency for the future.”
This is important because gaining the trust of all those stakeholders is no mean feat. Everyone is getting more demanding. Decisions need to be evidenced. Whether it’s telling a story to customers, partners, employees, suppliers, investors or executive-level decision-makers, the data has to be good quality, relevant and provide clarity. As Melody Chien, senior director analyst at Gartner, says: “Good-quality data provides better leads, better understanding of customers and better customer relationships. Data quality is a competitive advantage.”
For many organisations – and this will no doubt vary across vertical sectors – real-time or live streaming of data is seen as an important goal in delivering the sort of information needed to make quick decisions and build trust in projects, products and services. According to a recent DataStax report, The state of the data race 2022, 78% of the 500 technology leaders and practitioners surveyed across a variety of industries say real-time data is a “must-have”, not a “nice-to-have”.
What real-time data actually means will differ from one organisation to the next. How it is visualised for various stakeholders is key to its relevance, and that, too, will vary by industry and specific requirements. As Dom Couldwell, head of field engineering EMEA at DataStax, says: “Not all visualisations are created equal.”
How organisations want to use real-time data will, of course, vary and with that will come different needs for visualisation, the primary one being to make real-time decisions. There are cases where real-time visualisations are being used to educate and inform stakeholders of projects, while also enabling scenario planning. This is digital twin territory, a rapidly growing means of real-time visualisation that has found a home in infrastructure engineering projects.
“There is a productivity gap,” says Nicholas Cumins, COO at Bentley Systems, speaking at the firm’s annual Going Digital awards event in London recently. “We have to help bridge that gap with technology that enables customers to use their data more effectively.”
One of those customers is Dublin City University (DCU), which is using Bentley’s iTwin and OpenCities software to visualise multiple sets of campus data. Using sources such as internet of things (IoT) sound sensors, digital crowd counting and environmental sensors, DCU, which is part of the Smart Dublin programme, is creating simulations whereby autistic students, for example – DCU is a designated autism-friendly university – can see how to navigate through the campus in the least stressful way, using their mobile devices.
According to Kieran Mahon, smart DCU projects facilitator at the university, by visualising its data in real time, DCU, in partnership with Insight, one of Europe’s largest data analytics centres, “can simulate scenarios and use predictive analytics to improve mobility and safety on campus”. It also enables real-time monitoring of building performance and occupancy, as well as insights into traffic and air quality.
We are seeing similar stories in other industries, where the presentation of real-time data is being used for functions such as predictive maintenance. Thameslink, a 24-hour mainline rail service, for example, uses Railigent X, a Siemens Mobility app built with Tibco Spotfire software. The app collects data that has been analysed at the edge to reduce latency, and visualises potential issues picked up through the sensors. Engineers can see the problem in advance and know exactly where on the train they need to go to fix it. 
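The pattern described here – analysing sensor readings at the edge and surfacing only the anomalies for central visualisation – can be sketched in a few lines of Python. This is an illustrative stand-in, not Railigent X’s actual logic; the sensor trace and thresholds are hypothetical.

```python
from collections import deque
from statistics import mean, stdev

def edge_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate sharply from the recent rolling window.

    Only the flagged (index, value) pairs would leave the edge device,
    so a central dashboard visualises exceptions rather than raw telemetry.
    """
    recent = deque(maxlen=window)
    flagged = []
    for i, value in enumerate(readings):
        if len(recent) == recent.maxlen:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                flagged.append((i, value))
        recent.append(value)
    return flagged

# Hypothetical axle-bearing temperature trace: stable, then a spike.
trace = [41.0, 41.2, 40.9, 41.1, 41.0, 41.2, 55.0, 41.1]
print(edge_anomalies(trace))  # → [(6, 55.0)]
```

Filtering at the edge this way is what keeps latency down: the visualisation layer only has to render the handful of readings that matter.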
“Advanced analytics tools able to visualise real-time and historical data can immediately highlight critical conditions, so that users can spot them and alert teams to intervene at an optimal time,” says Alessandro Chimera, director of digitalisation strategy at Tibco. “Knowing well ahead of time when an asset will fail avoids unplanned downtimes and broken assets. On average, according to Deloitte, predictive maintenance increases productivity by 25%, reduces breakdowns by 70% and lowers maintenance costs by 25%.”
What all this shows is how data analytics and visualisation are evolving rapidly to suit industries’ specific needs. What is interesting is that so much of this is driven through an ecosystem of partners. No one organisation can deliver the breadth and depth of data and tools needed to make such projects work and there is much to learn from that. Collaborations and partnerships can elevate and enhance real-time data visualisation and value.
For many organisations, however, real-time data is still virgin territory, and real-time visualisation is one of those technologies where reality cannot hope to match expectation – at least according to Jaco Vermeulen, CTO of tech consultancy BML Digital.
“Almost every customer says they want real-time visualisation, but then nine out of 10 can’t qualify why they need it, especially when it comes to what decisions or actions it will enable,” says Vermeulen. “This is usually because they start from the belief that the data is always available and therefore should be immediately understandable and yield profound insight. The truth is a bit more challenging.”
Vermeulen’s thoughts are echoed by Tom Fairbairn, distinguished engineer at Solace. Fairbairn has worked with the London Stock Exchange, Unilever, Heineken and Nasa, among others, helping them to visualise their data to create critical insights and value.
“It is the real-time decisions that create impact,” he says. “Optimising supply chains, reducing waste and pollution, optimising operations, and informing and satisfying consumers. At the simplest level, some data is inherently real-time only. It’s no good getting information about a storm from a Nasa weather satellite 12 hours after a batch run. The command-and-control system for that satellite needs to make sure it’s tracking that storm as soon as it’s detected.”
Fairbairn talks about other scenarios, such as Unilever, which has created a Virtual Ocean Control Tower that plots the position, schedule and status of every vessel and container in its supply chain. This has enabled Unilever to make decisions about rerouting or rescheduling ships, containers and production jobs to optimise costs, mitigate delays, get goods to customers and minimise fuel burn.
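The rerouting decisions described here boil down to weighing competing costs against live data. As a deliberately simplified sketch – nothing here reflects Unilever’s actual system, and all route names and figures are hypothetical – the core trade-off might look like this:

```python
def best_route(routes, delay_penalty_per_day=30_000):
    """Pick the route with the lowest combined cost, trading fuel
    spend against a penalty for each day of delay."""
    return min(routes, key=lambda r: r["fuel_cost"]
               + r["delay_days"] * delay_penalty_per_day)

# Hypothetical options for one delayed container.
routes = [
    {"name": "via Suez", "delay_days": 2, "fuel_cost": 120_000},
    {"name": "via Cape", "delay_days": 9, "fuel_cost": 90_000},
]
print(best_route(routes)["name"])  # → via Suez: the faster, pricier route wins
```

The value of a live control tower is that `delay_days` is no longer an estimate but a continuously updated fact, so the answer to this calculation can flip mid-voyage.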
“Of course, the great-grandfather of real-time decision-making is in financial markets,” says Fairbairn. “The trading screen is the epitome of real-time data visualisation – and it is only useful if good decisions are made on that data in real time.”
Mike Smith, director of engineering at financial investing social network Commonstock, would agree. Smith says Commonstock’s use of DataStax’s Astra Streaming has become a critical technology in the infrastructure that fuels its platform, streaming real-time data to users without downtime. “With Astra Streaming, we can deliver in real time to our customers and help them get value from their data and wider market information in one place,” he says. “Visualising this data for customers to use, and make decisions with, is critical to our business.”
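Astra Streaming is built on Apache Pulsar’s publish/subscribe model, in which every subscriber to a topic receives every published event. As a stand-in that runs without a broker, the sketch below mimics that fan-out in plain Python; the topic and event names are hypothetical.

```python
import queue
import threading

class Topic:
    """A minimal in-process stand-in for a pub/sub topic: each subscriber
    gets its own queue, so every consumer sees every published event."""
    def __init__(self):
        self._subscribers = []
        self._lock = threading.Lock()

    def subscribe(self):
        q = queue.Queue()
        with self._lock:
            self._subscribers.append(q)
        return q

    def publish(self, event):
        with self._lock:
            for q in self._subscribers:
                q.put(event)

# Hypothetical market-data feed fanned out to two dashboard consumers.
prices = Topic()
dashboard_a = prices.subscribe()
dashboard_b = prices.subscribe()
prices.publish({"symbol": "XYZ", "price": 101.5})
print(dashboard_a.get(), dashboard_b.get())
```

In a real deployment, each dashboard consumer would drain its queue on its own schedule, which is what lets a visualisation stay live without blocking the producers.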
Nowhere is the critical nature of real-time data more acute than in healthcare. The Welsh Ambulance Services NHS Trust (WAST) is one of the most clinically advanced ambulance services in the world, with call handlers and clinical contact-centre staff dealing with more than half a million calls a year, around the clock. Like any ambulance service currently stretched by demand and staff shortages, optimisation is key, and real-time visualisation is playing its part.
Digital transformation business TPXimpact developed a Power BI dashboard for tracking ambulances, showing health professionals where ambulances are in the region and whether they have patients on board or are free for jobs.
“One important thing we have learned with real-time visualisation is how much it can help people on the ground to make better decisions, which ultimately improve patient outcomes,” says Martyn Matthews, director of TPXimpact. “We can provide improved insight into what is actually happening, rather than what they might think is happening. The data is all there, but it can be challenging for organisations to gain a single overview and make this accessible to their teams.”
The data quality and management problem is a common thread. Getting consistent, quality, real-time data is a challenge, but this is a fast-moving space, where machine learning and AI will make a huge difference in how data is sourced and “cleaned” and how insights, and therefore decisions, are reached.
This view is supported by Tanya Hyams-Young, CEO of SourseAI, who says: “There are very few places where a real-time flow of data and dynamic decisions are genuinely required.” Her company’s AI model predicts customer behaviours and spots opportunities to improve products, services and customer revenue, with little to no visualisation involved.
It is a different take, a different direction and one that is gaining some momentum, not least among what Gartner calls the “visionaries” in analytics and business intelligence (BI) platform companies.
“Dead-end dashboards offer inert, dead data,” says Damien Brophy, vice-president EMEA at ThoughtSpot, arguing that the future has to be self-service and AI-enabled analytics. “Data can go stale quickly and dashboards are now merely a way of serving up old news,” he adds.
From a real-time data viewpoint, there seems to be some sense in that, but as with static data visualisation, there is an entrenched audience that will take time to change. What is clear is that real-time visualisation may not be for everyone, but as a tool for proving cases and delivering trust to stakeholders, it is increasingly valuable.
All Rights Reserved, Copyright 2000 – 2022, TechTarget
