The ultimate purpose of any data pipeline is the ability to present insights clearly and convincingly to decision-makers – often non-technical users whose primary analytical tool is still Excel.
Presentation Layer Options
Within Databricks, Dashboards are frequently used to surface data to this audience and provide quick internal visibility.
However, anyone who has relied on Databricks Dashboards as a presentation layer will be familiar with their limitations. Interactivity is constrained, large datasets are often truncated, version control is minimal, cross-dashboard interactions are not supported, and geospatial visualisation options are fairly limited and difficult to customise.
Databricks Apps offer an alternative, enabling fully interactive, application-style experiences where non-technical users can explore data intuitively. For Data and AI teams, Databricks Apps also represent the fastest path to building and deploying internal tools directly on the Databricks Data Intelligence Platform, without introducing external infrastructure or tooling.
Apps run on top of Unity Catalog, meaning all permissions, row-level filters, and column masking rules defined centrally apply automatically inside the app. Business users only see the data they are authorised to access, with no additional security logic required in the app code.
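As a rough sketch of how this works in practice: when user authorization is enabled for an app, Databricks forwards the viewer's own access token in a request header, and the app queries the warehouse with that token so Unity Catalog enforces the viewer's row filters and masks server-side. The header name, hostname, HTTP path, and table name below are assumptions for illustration; the commented section needs the `databricks-sql-connector` package and a live workspace.

```python
# Sketch: querying Unity Catalog from inside a Databricks App so the
# viewer's own permissions (row filters, column masks) apply automatically.
# Assumes user authorization is enabled, which forwards the viewer's token
# in an `x-forwarded-access-token` header (an assumption to verify).

def user_token(headers):
    """Return the forwarded per-user token if present.

    Header names are treated case-insensitively, so normalise before lookup.
    """
    normalised = {k.lower(): v for k, v in headers.items()}
    return normalised.get("x-forwarded-access-token")

# Inside the app itself (placeholders throughout):
#
# import streamlit as st
# from databricks import sql
#
# with sql.connect(
#     server_hostname="dbc-xxxx.cloud.databricks.com",  # placeholder host
#     http_path="/sql/1.0/warehouses/abc123",           # placeholder warehouse
#     access_token=user_token(st.context.headers),      # viewer's token
# ) as conn:
#     rows = conn.cursor().execute(
#         "SELECT * FROM retail.gold.store_catchments"  # hypothetical table
#     ).fetchall()
```

Because the token belongs to the viewer rather than a service principal, no filtering logic needs to live in the app code at all.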
Demo Use Case
I explored Databricks Apps as a presentation layer for a location-aware retail analytics use case, with a strong emphasis on geospatial analysis.
Imagine a large UK grocery retailer’s marketing team asking a familiar question: which customer segments should we target to maximise the impact of our next campaign, and where? Rather than running broad, national promotions, the goal is to design campaigns that are locally relevant, shaped by who actually lives around each store and how those customers behave.
For this demo I used an open-source Kaggle dataset as a proxy for transactions and customer behaviour, combined with a real UK supermarket locations dataset.
Customers were assigned home locations and core personas based on UK population distributions from ONS data. Using a BisectingKMeans unsupervised learning model, customers were clustered into behavioural segments. To make those segments interpretable and business-friendly, an LLM was used to generate concise personas and descriptions for each cluster.
The App
With all this in place, a Streamlit-based Databricks App brings everything together: exploring the mix of customer segments within each store’s catchment area, visualising geographic patterns, and simulating campaign scenarios to understand where targeted interventions could drive the greatest uplift in visits and spend.
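One building block of the catchment analysis can be sketched in a few lines: assign each customer to their nearest store by great-circle distance. The coordinates below are placeholders; the real app draws store locations from the UK supermarket dataset and customer home locations from the ONS-derived assignment.

```python
# Sketch: nearest-store assignment via the haversine formula.
# Store and customer coordinates are invented placeholders.
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between points in decimal degrees."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))  # Earth radius ~6371 km

stores = np.array([[51.5074, -0.1278],    # London store (placeholder)
                   [53.4808, -2.2426]])   # Manchester store (placeholder)
customers = np.array([[51.45, -0.10],
                      [53.50, -2.30],
                      [52.20, -0.90]])

# Distance matrix (customers x stores) via broadcasting, then argmin per row
d = haversine_km(customers[:, None, 0], customers[:, None, 1],
                 stores[None, :, 0], stores[None, :, 1])
nearest = d.argmin(axis=1)
print(nearest)  # index of each customer's nearest store
```

In the real app, aggregating segment labels within each store's catchment produced the per-store segment mix that the campaign simulator operates on. A fixed-radius or drive-time catchment would slot in the same way.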
The result is an interactive, location-aware decision surface – built entirely within Databricks – that bridges advanced analytics and real-world retail decision-making.
Below is an example of a store manager’s journey from local insight to campaign decision:
Engineer POV
From an application development standpoint, Databricks Apps significantly reduced friction. The emphasis remained on solving real business problems, not on managing infrastructure, security, or deployment complexity.
This made it possible to build and iterate on the app quickly, while staying fully within the Databricks platform. It was easy to build and debug locally, workspace sync with live updates was seamless (`databricks sync --watch`), and CI/CD was straightforward using Databricks Asset Bundles.
One current consideration is that Databricks Apps do not yet support automatic idle shutdown, which means long-running apps can incur ongoing compute costs if left running continuously.
This is already on the Databricks roadmap, and in the meantime, apps can be programmatically started and stopped using the Databricks APIs, making it straightforward to integrate cost controls into existing operational workflows.
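A minimal sketch of such a cost control follows. The endpoint shape (`/api/2.0/apps/{name}/stop` and `/start`) reflects my reading of the Apps REST API and should be verified against the current API reference before use; the host, token, and app name are placeholders, and a scheduled job calling `stop_app` out of hours is one simple way to wire this in.

```python
# Sketch: stopping a Databricks App out of hours via the REST API.
# Endpoint path is an assumption to verify against the Apps API reference.
import urllib.request

def app_action_url(host, app_name, action):
    """Build the REST URL for starting or stopping a named app."""
    assert action in ("start", "stop")
    return f"{host.rstrip('/')}/api/2.0/apps/{app_name}/{action}"

def stop_app(host, token, app_name):
    """Issue the stop request; e.g. from a job scheduled at end of day."""
    req = urllib.request.Request(
        app_action_url(host, app_name, "stop"),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.status
```

The Databricks Python SDK exposes equivalent start/stop operations on its apps client, which may be preferable to hand-rolled HTTP calls in production workflows.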

