How the HEAVY.AI platform accelerates geospatial intelligence and delivers advanced analytics and real-time data visualizations that help telcos, utilities, and government agencies improve operations and minimize risk.

In today's data-dependent world, 2.5 quintillion bytes of data are created every day, and IDC predicts that by 2025, 150 trillion gigabytes of real-time data will need analysis daily. How will businesses keep up with this incomprehensible volume and make sense of the data they are dealing with now and in the future? Traditional analytical methods choke on the volume, variety, and velocity of data being collected today. HEAVY.AI is an accelerated analytics platform with real-time visualization capabilities that helps companies leverage readily available data to find risks and opportunities.

Accelerated geospatial analytics

The HEAVY.AI platform offers a myriad of features to better inform your most critical decisions with stunning visualizations, accelerated geospatial intelligence, and advanced analytics. HEAVY.AI converges accelerated analytics with the power of GPU and CPU parallel compute. Five core tools make up the HEAVY.AI platform: Heavy Immerse, Heavy Connect, Heavy Render, HeavyDB, and HeavyRF.

Heavy Immerse is a browser-based data visualization client that serves as the central hub for users to explore and visually analyze their data. Its interactive data visualization works seamlessly with the HEAVY.AI server-side technologies of HeavyDB and Heavy Render, drawing on an instantaneous cross-filtering method that creates a sense of being “at one with the data.” With Heavy Immerse, users can directly interact with dynamic, complex data visualizations, which can be filtered together and refreshed in milliseconds. Users can place charts and complex visualizations within a single dashboard, providing a multi-dimensional understanding of large datasets. Heavy Immerse also offers native cross-filtering with unprecedented location and time context, dashboard auto-refresh, no-code dashboard customization, and a parameter tool, all of which make various tasks more efficient and dramatically expand an organization’s ability to find previously hidden opportunities and risks in its enterprise.

A HEAVY.AI data visualization demo using New York City taxi ride data. (Image: HEAVY.AI)

HeavyDB is a SQL-based, relational, columnar database engine developed specifically to harness the massive parallelism of modern GPU and CPU hardware, so that analysts can query big data with millisecond results.

Working in tandem with HeavyDB, the Heavy Render rendering engine connects the extreme speed of HeavyDB SQL queries to the complex, interactive, front-end visualizations offered in Heavy Immerse and custom applications. Heavy Render creates lightweight PNG images and sends them to the web browser, avoiding large data transfers while keeping the underlying data in the visualizations interactively accessible, as if it were browser-side, thanks to HeavyDB’s fast SQL queries. Heavy Render uses GPU buffer caching, modern graphics APIs, and an interface based on the Vega Visualization Grammar to generate custom point maps, heatmaps, choropleths, scatterplots, and other visualizations with zero-latency rendering.
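To make the division of labor concrete, here is a minimal sketch of the kind of cross-filtered geospatial SQL that Heavy Immerse issues as a user brushes a map and a time chart, sent from Python with the open-source pymapd client (since succeeded by the heavyai package). The connection settings and the taxi_trips table and its columns are hypothetical stand-ins for the demo data pictured above, not a documented schema.

```python
# A minimal sketch of the kind of cross-filtered query Heavy Immerse issues.
# Connection settings and the taxi_trips table/columns are hypothetical.
from pymapd import connect

con = connect(
    user="admin",
    password="HyperInteractive",  # demo-style credentials; use your own
    host="localhost",
    dbname="heavyai",
)

# Combine a spatial predicate (a map brush) with a time window (a time-chart
# brush), the same pattern a dashboard applies across all of its charts.
query = """
SELECT COUNT(*) AS trips,
       AVG(fare_amount) AS avg_fare
FROM taxi_trips
WHERE ST_Contains(
        ST_GeomFromText(
          'POLYGON((-74.02 40.70, -74.02 40.75, -73.96 40.75,
                    -73.96 40.70, -74.02 40.70))'),
        pickup_point)
  AND pickup_datetime BETWEEN '2015-01-01' AND '2015-01-31'
"""

trips, avg_fare = con.execute(query).fetchone()
print(f"{trips} trips, average fare ${avg_fare:.2f}")
```

Every chart on an Immerse dashboard re-issues a variant of such a query as the filters change, which is why the database’s millisecond response time translates directly into interactive dashboards.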
With Heavy Connect, users can immediately analyze and visualize their data wherever it currently exists, without the need to export or import data or duplicate storage. This effectively eliminates data gravity, making it easier to leverage data within the HEAVY.AI system and derive value from it. Heavy Connect provides a no-movement approach to caching that lets organizations simply point HeavyDB at their data assets without ingesting them directly, making the data readily available for queries, analysis, and exploration.

Through HEAVY.AI’s platform integration with NVIDIA Omniverse, the HeavyRF radio frequency (RF) propagation module gives telcos an entirely new way to connect their 4G and 5G planning efforts with their customer acquisition and retention efforts. It is the industry’s first RF digital twin solution, enabling telcos to simulate potential city-scale deployments as a faster, more efficient way of optimizing cellular tower and base station placements for best coverage. With the power of the HeavyDB database, HeavyRF can transform massive amounts of LiDAR 3D point cloud data into a high-fidelity terrain model, allowing the construction of an incredibly high-resolution model of the buildings, vegetation, roads, and other features of urban terrain.

Impactful geospatial use cases

HEAVY.AI delivers many different benefits across the telco, utilities, and public sector spaces. In the telco sector, for example, organizations can use HeavyRF to plan their 5G tower deployments more efficiently. HeavyRF allows telcos to minimize site deployment costs while maximizing quality of service for both entire populations and targeted demographic and behavioral profiles. The HeavyRF module supports network planning and coverage mapping at unprecedented speed and scale, which can be used to rapidly develop and evaluate strategic rollout options, including thousands of microcells and non-traditional antennas. Simulations can be run interactively against full-resolution, physically precise LiDAR and clutter data at metro-regional scale, avoiding the need for downsampling and the false service qualifications it can produce.

Utility providers also benefit from accelerated analytics and geospatial visualization capabilities. Using HEAVY.AI, utility providers monitor asset performance, track resource use, and identify unseen business opportunities through advanced modeling, remotely sensed imagery, and hardware-accelerated web mapping. Their analysts, scientists, and decision-makers can also quickly analyze data related to catastrophic events and develop effective strategies for mitigating natural disasters.

For example, wildfires are often caused by dead trees striking power lines. Historically, utilities managed this problem by sending hundreds of contractors to manually inspect lines and look for dying vegetation, an expensive, time-consuming, and imprecise process with revisit times of typically four years. More recently, utilities have been able to analyze weekly geospatial satellite data to pinpoint the locations with the worst tree mortality. Equipped with these granular insights, utilities can determine where dead trees and power lines are most likely to come into contact, then take action to remove vegetation and avoid catastrophe. One East Coast utility, for example, found that more than 50% of its outage risk originated from 10% of its service territory. Since major utilities spend hundreds of millions of dollars per year on asset and vegetation management, even modest improvements in targeting can have large positive impacts on both public safety and taxpayers’ wallets.
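As a rough sketch of that vegetation-management workflow, the query below ranks power-line segments by the number of recent tree-mortality detections within roughly 50 meters. Everything here is a hypothetical illustration: the power_lines and tree_mortality tables, their columns, and the degree-based distance threshold are assumptions, not a documented schema.

```python
# Hypothetical illustration: rank power-line segments by nearby recent
# tree-mortality detections in HeavyDB. Tables and columns are assumptions.
from pymapd import connect

con = connect(user="admin", password="HyperInteractive",
              host="localhost", dbname="heavyai")

# ST_Distance on lon/lat geometries is measured in degrees here;
# 0.0005 degrees is roughly 50 meters at mid latitudes.
query = """
SELECT l.segment_id,
       COUNT(*) AS dead_trees_nearby
FROM power_lines AS l
JOIN tree_mortality AS t
  ON ST_Distance(l.line_geom, t.tree_point) < 0.0005
WHERE t.detected_week >= '2024-01-01'
GROUP BY l.segment_id
ORDER BY dead_trees_nearby DESC
LIMIT 20
"""

for segment_id, dead_trees in con.execute(query):
    print(segment_id, dead_trees)
```

A ranked list like this is what lets a utility send crews to the 10% of territory carrying most of the risk instead of walking every line.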
The benefits of accelerated analytics do not stop there. In the public sector, federal agencies have the power to render geospatial intelligence with millisecond results, or to accelerate their existing analytics solutions to incredible speeds. HEAVY.AI can cross-filter billions of geo data points on a map and run geo calculations at a scale far beyond the ability of existing geospatial intelligence systems. These advancements in geospatial analysis unlock a wealth of new use cases, such as all-source intelligence analysis, fleet management, logistics operations, and beyond.

For telcos, utilities, the public sector, and other organizations all over the world, data collection will continue to expand, and each decision based on those massive datasets will be critical. By bringing together multiple, varied datasets and allowing humans to interact with their data at the speed of thought, HEAVY.AI enables organizations to make real-time decisions that have real-life impacts.

Dr. Michael Flaxman is the product manager at HEAVY.AI. In addition to leading product strategy at the company, Dr. Flaxman focuses on the combination of geographic analysis with machine learning, or “geoML.” He has served on the faculties of MIT, Harvard, and the University of Oregon.

—

New Tech Forum provides a venue to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all inquiries to newtechforum@infoworld.com.