Explore the Agenda
7:30 am Check-In, Coffee, & Networking
8:30 am Workshop A: Hosted by Procore
9:30 am Morning Networking Break
Series A: Data Engineering & Governance
Workshop B
Monday 20th April 2026
10:00 am Building & Evolving a Data Governance Framework in a Dynamic Data Landscape to Remain Agile
Your firm’s data governance policy is never fully finished, especially as the construction industry pushes towards more advanced analytics, the volume of project data explodes, and project partner relationships become increasingly complex.
This session will guide attendees through the practical steps of building and evolving a governance framework to ensure that your data is accurate, consistent, and ready to drive smarter decisions as your analytics toolkit expands.
Join forces with fellow analytics professionals to:
- Evaluate whether your data governance frameworks should live within your programs or be maintained as standalone documentation
- Hear lessons learned on balancing formal documentation (data dictionaries, rules, access levels, and policy maintenance) with practical implementation: where does the responsibility lie, and how can you incorporate decisions about how end users interact with governed data?
- Understand how to map and model data across multiple systems, including sensitive data
- Navigate governance challenges around security access and strategic layering of AI tools to ensure your framework evolves in line with enterprise needs
Series B: Analytics Insights & Visualization
Workshop E
Monday 20th April 2026
10:00 am Ensuring Dashboards Deliver Value to the End User to Support Their Decision Making
Dashboards and insights only deliver value if they align with the needs of your internal customers, and that alignment is paramount to becoming a truly data-driven organization and retaining a competitive edge.
This session will explore how construction firms can design analytics that go beyond reporting to drive actionable insights, integrating stakeholder feedback to refine them over time.
Leave with a vision of how to:
- Gain stakeholder trust by explaining data sources, processes, and insights to address misalignment between perceptions and what the data show
- Establish methods to evaluate dashboard usage, identify high and low-value reports, and adapt analytics platforms to deliver what teams truly need for daily operations, decision-making, and resource planning
- Create an effective feedback loop between end users and analytics teams to develop iterative approaches to refine dashboards based on evolving needs
- Empower teams to create their own dashboards and reports aligned with their needs
12:00 pm Networking Lunch Break
Workshop C
1:00 pm Fixing Incomplete or Dirty Data to Enable High Quality Analytics
As firms scale and tech stacks grow, data flowing from multiple systems into a data lake, combined with erroneous entry, can create gaps and anomalies that limit the insights you can draw. Analytics success often hinges on patching, integrating, and enriching data, but this can be a time-consuming, manual process.
In this interactive workshop you will have the opportunity to level up your data patching and cleaning.
Collaborate with your peers to gain practical and technical insights into:
- Uncovering why incomplete or inconsistent connections between sources can create significant challenges with data quality, and how to efficiently find and fix the root cause
- Determining how far back you should clean or enrich your firm’s data to balance ROI and effort when deciding what data to trust, clean, or ignore
- Deciding which systems to clean first, creating a few clean anchor projects from which to build best practices that scale
- Exploring how to leverage automation and AI-driven validation, including anomaly detection and missing-information repair, to strengthen data integrity, data structure, and analytics outcomes
- Establishing the risks of relying on these validation techniques given their potential unreliability
CASE STUDY
Workshop F
1:00 pm Creating Robust KPIs for Your End Users to Reflect True Project & Firm Health
Creating meaningful KPIs in construction to measure project and firm health goes beyond pulling data. It’s about defining metrics that truly reflect productivity, quality, safety, and operational performance.
This session will explore how to convert raw data into actionable KPIs and align analytics with real-world business operations and objectives.
Dive into this interactive session to:
- Map out the data sources available to you to create KPIs, and how they complement each other
- Explore how to identify the right productivity, quality, safety, and operational metrics that reflect your firm’s work, outputs, processes, and overarching goals
- Understand how data collected manually versus automatically, and in the field versus the office, creates challenges in establishing robust, repeatable KPIs
- Overcome the challenges of creating KPIs for qualitative data, such as quality or safety, versus quantitative data, such as financial numbers
- Learn practical methods for developing KPIs with your internal customers through team workshops to understand their objectives, iterative analysis of available system data, and collaboration with field staff
CASE STUDY
3:00 pm Afternoon Networking Break
Workshop D
3:30 pm Advancing Traditional Machine Learning to Achieve Effective Predictive Modeling
While 2025 has been the year of the LLM, the most innovative, forward-thinking firms will be those whose analytics also integrate machine learning and neural networks to unlock greater predictive modeling and forecasting.
This session is the ultimate deep dive into how to build relevant machine learning algorithms for construction decision making, explore transformer architectures to automate text-heavy workflows, and determine the best applications for machine learning to unlock greater value from your data.
Stay ahead of the curve by:
- Defining the practical, effective machine learning and neural network applications in construction from classification and linear regression to predictive modeling for forecasting and optimization
- Exploring how to use transformer architectures to process large volumes of text to automate workflows, interpret stakeholder needs, and streamline decision-making, including how NoSQL databases can support these workflows
- Evaluating hybrid approaches between AI and machine learning to enable natural language interfaces, enhance analytics insights, and improve operational efficiencies given the limits of LLMs in numeric or predictive tasks
CASE STUDY
Workshop G
3:30 pm Creating Robust Conversational Analytics Tools to Increase End User Accessibility
As analytics teams strive to foster data-driven decision making across every layer of the enterprise, improving the accessibility of the data is paramount. LLMs have sparked a revolution in conversational analytics, with AI agents refining the user interface and integrating analytics seamlessly into daily workflows.
Yet, challenges remain in ensuring these data queries return accurate, unbiased insights.
Level up your practical analytics workflows by:
- Learning how to build AI agents and generative tools based on your data lake to allow stakeholders to query data directly without needing deep knowledge of data structuring or dashboards, accelerating decision making and analytics adoption with field users
- Evaluating how generative AI can create new outputs beyond text-based answers as part of workflows
- Understanding which parts of your firm are best suited to the implementation of these tools to reduce the risk of incorrect decisions
- Creating backstops to ensure LLM-driven insights are reliable and actionable within these defined workflows
- Adopting strategies for making conversational analytics as effective as possible, including prompt engineering and contextual queries, and educating end users on how specificity and context directly influence the value of AI outputs
CASE STUDY