I build the analytics infrastructure that turns raw, inconsistent data into something a business can
actually act on. Pipelines, dashboards, data quality work, and the stakeholder communication that makes
it land.
Three years. Two industries. End-to-end ownership from day one.
I got into analytics the hard way. I started as a data annotator, taught myself the stack, and made the case to transfer into the analytics function. Within three years I was the primary analytics resource for the team, responsible for both internal reporting and every customer-facing output.
Over three years I have worked across financial services and tech operations, building analytics end-to-end. That means data pipelines and gold-layer sources in Databricks, executive reporting systems in Tableau and Power BI, and the data quality work that makes all of it trustworthy. At Nokia I took the same approach into a strategy and operations context, delivering Power BI solutions and automated workflows for a global telecom team.
That instinct to find the work rather than wait for it is still how I operate.
Nokia
Strategy & Operations Co-op Analyst
Sep 2025 – Dec 2025
Prodigal Technologies
Business Intelligence Analyst
Aug 2022 – Aug 2025
Collections environments generate large volumes of call data, but most operational teams have no structured way to monitor agent performance or call outcomes without manual digging. I built this project to demonstrate what that reporting infrastructure looks like when properly architected, using a synthetic dataset modeled on real industry behavior, including time-of-day trends and holiday patterns.
A working reporting architecture that mirrors a production collections environment, built to show end-to-end thinking from data generation through executive-facing visualization.
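To give a flavor of the data-generation step, here is a minimal sketch of how call volume can be shaped by day-of-week, holiday, and time-of-day effects. Every column name, rate, and holiday below is a hypothetical stand-in, not the project's actual schema.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Hypothetical holidays on which call volume drops sharply.
HOLIDAYS = {"2024-01-01", "2024-07-04", "2024-12-25"}

def daily_call_volume(day: pd.Timestamp, base: int = 2000) -> int:
    """Scale a base volume by day-of-week and holiday effects."""
    weekday_factor = 1.0 if day.dayofweek < 5 else 0.4   # quiet weekends
    holiday_factor = 0.15 if day.strftime("%Y-%m-%d") in HOLIDAYS else 1.0
    return int(rng.poisson(base * weekday_factor * holiday_factor))

def simulate_day(day: pd.Timestamp) -> pd.DataFrame:
    n = daily_call_volume(day)
    # Calls cluster in the morning and late afternoon (bimodal hours, 8am-8pm).
    hours = rng.choice(np.arange(8, 21), size=n,
                       p=np.array([4, 6, 8, 7, 5, 4, 3, 4, 6, 7, 5, 3, 2]) / 64)
    return pd.DataFrame({
        "call_ts": [day + pd.Timedelta(hours=int(h), minutes=int(rng.integers(60)))
                    for h in hours],
        "agent_id": rng.integers(1, 51, size=n),
        "duration_s": rng.gamma(shape=2.0, scale=120.0, size=n).astype(int),
        "outcome": rng.choice(["promise_to_pay", "no_answer", "dispute", "callback"],
                              size=n, p=[0.15, 0.55, 0.05, 0.25]),
    })

calls = pd.concat(simulate_day(d) for d in pd.date_range("2024-01-01", "2024-12-31"))
```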
Research partner registrations were being managed entirely through email and spreadsheets. There was no way to track participation, match faculty systematically, or give leadership any view into what was happening across the program.
Eliminated manual coordination overhead entirely. Leadership went from zero visibility into the program to a real-time dashboard. Handed off with full documentation and built to scale across the broader organization.
Generating accurate text descriptions for images requires a deep understanding of visual patterns. I developed this AI system to automate image captioning with a focus on model performance and security.
Achieved a competitive BLEU score of 0.33, demonstrating the model's ability to handle complex outdoor scenes and human behavior. The final product provides a scalable foundation for automated alt text and visual data indexing.
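For context on the metric, BLEU compares generated captions against human references n-gram by n-gram. A minimal sketch using NLTK's corpus_bleu, with made-up captions rather than the project's evaluation data:

```python
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

# Hypothetical reference captions (several per image) and model outputs.
references = [
    [["a", "man", "rides", "a", "bike", "down", "a", "hill"],
     ["a", "cyclist", "descends", "a", "steep", "trail"]],
]
hypotheses = [["a", "man", "rides", "a", "bike", "on", "a", "trail"]]

# Smoothing avoids zero scores when a higher-order n-gram has no match.
score = corpus_bleu(references, hypotheses,
                    smoothing_function=SmoothingFunction().method1)
print(f"BLEU: {score:.2f}")
```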
Navigating graduate school applications involves significant uncertainty. I built this predictive tool to help applicants quantify their admission chances based on standardized academic metrics and research profiles.
The application successfully bridges the gap between raw statistical data and user-friendly guidance. It gives prospective students an immediate, data-driven benchmark for their UCLA graduate program applications.
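A minimal sketch of how such a predictor could be framed, assuming a logistic regression over hypothetical applicant features; the real tool's model, inputs, and training data are not shown here.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical file and columns: gre, gpa, research, admitted.
df = pd.read_csv("admissions.csv")
X, y = df[["gre", "gpa", "research"]], df["admitted"]

# Scale features, then fit a logistic model for admission probability.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

# Estimated chance for one prospective applicant.
applicant = pd.DataFrame([{"gre": 325, "gpa": 3.7, "research": 1}])
print(f"Estimated chance: {model.predict_proba(applicant)[0, 1]:.0%}")
```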
Real estate pricing is influenced by a complex mix of property features and market cycles. I built this tool to give homeowners and investors an objective, data-driven estimate that considers both house-specific details and the wider economic climate.
The model achieved a high level of reliability with an R-squared of 0.79, capturing nearly 80 percent of the price variance. It returns a clear, formatted valuation that helps users navigate market fluctuations with greater confidence.
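A minimal sketch of the evaluation framing, assuming a gradient-boosted regressor over hypothetical property and market features; the actual model and feature set may differ.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Hypothetical schema: property features plus a market-cycle indicator.
df = pd.read_csv("housing.csv")
features = ["sqft", "bedrooms", "bathrooms", "year_built", "mortgage_rate"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["price"], test_size=0.2, random_state=0)

# Fit on the training split, then score held-out predictions.
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print(f"R-squared: {r2_score(y_test, model.predict(X_test)):.2f}")
```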