Puneet Sran

Data Engineering Professional

High Business Impact • 13+ Data Pipelines Built • 3+ Years Experience • 100% Pipeline Reliability

About Me

Data Engineering Professional with a Master's in Data Science and hands-on experience in SQL, Python, and cloud-based data platforms (GCP, BigQuery). I specialize in building scalable data pipelines that drive real business impact.

At TELUS, I've designed 13+ production data pipelines and built fault-tolerant ETL systems that process large volumes of customer data daily, powering marketing automation and revenue recovery across multiple brands.

CHLOE Awards 2024 Nominee - "Courage to Innovate"
Master's in Data Science (GPA: 3.85)

Technical Skills

Programming Languages

Python • SQL • R • JavaScript • C • Markdown

Cloud & Data Platforms

Google Cloud Platform • BigQuery • Adobe Experience Platform • Google Cloud Storage • Cloud Workflows • Cloud Scheduler • Secret Manager • Looker Studio • Tableau

Data Engineering

ETL/ELT Pipelines • Data Modeling • Pulumi (IaC) • YAML Configuration • Batch Processing • Workflow Orchestration • CI/CD Pipelines

Machine Learning

pandas • NumPy • Scikit-Learn • TensorFlow • PyTorch • NLP

Professional Experience

Data Engineer

TELUS Communications Inc. • Mar 2024 - Present • Vancouver, BC
Build production data pipelines and fault-tolerant ETL systems powering marketing automation and revenue recovery
  • Cart Abandonment Recovery System: Built a multi-stage data pipeline that identifies cart abandoners across multiple product lines, with SQL logic for consent management, duplicate prevention, and multi-brand customer identification, recovering significant revenue through targeted follow-up campaigns
  • Email Attribution Dashboard: Developed comprehensive revenue attribution system tracking customer journeys from email campaigns to purchases across multiple brands with 7-day attribution windows and complex product pricing calculations, powering executive Looker Studio dashboards
  • AJO Anomaly Detection System: Created an automated anomaly detection system monitoring Adobe Journey Optimizer campaigns with containerized Python scripts, batch job orchestration, and intelligent alerting for proactive campaign management
  • Enterprise Data Pipeline Architecture: Built 13+ production data pipelines processing large volumes of customer data daily with high reliability, orchestrating end-to-end data flows between cloud data warehouse and marketing automation platforms
  • Batch Processing & Workflow Orchestration: Implemented fault-tolerant ETL workflows using Cloud Workflows, batch job execution, and automated monitoring with daily alerting for ingestion failures and delivery issues
  • Infrastructure as Code: Developed and maintained enterprise data infrastructure using Pulumi, managing cloud resources, datasets, stored procedures, and deployment configurations across multiple environments with versioned deployments
Python • SQL • GCP • BigQuery • Adobe Experience Platform • Pulumi

Developer Analyst I & II

TELUS Communications Inc. • Aug 2021 - Feb 2024 • Vancouver, BC
  • Analytics Implementation: Led implementation of analytics solutions using dataLayer and Adobe Launch across multiple digital properties
  • Stakeholder Management: Collaborated with product owners and data analysts to translate business requirements into technical solutions
  • Data Accessibility: Enhanced data usability and accessibility for cross-functional teams through improved data architecture
Adobe Launch • JavaScript • OneTrust • Data Analytics

Mechatronic Engineer

Nedieon Inc. • Mar 2020 - Feb 2021 • Port Coquitlam, BC
  • Automation Design: Designed autonomous golf ball collection robot, eliminating manual labor and reducing operational costs for golf courses
  • Sensor Programming: Programmed various sensors and control systems using Python and C for real-time object detection and navigation
  • Engineering Analysis: Performed mechanical assembly, Finite Element Analysis, and system control using Arduino and ESP32 microcontrollers
Python • C • Arduino • ESP32 • FEA

Featured Projects

Cart Abandonment Recovery System

Data Engineering • Revenue Generation

Sophisticated multi-stage data pipeline system that identifies cart abandoners across multiple product lines, then orchestrates targeted email campaigns through marketing automation platforms to recover lost revenue.

Significant revenue recovery through intelligent customer journey orchestration and targeted email campaigns
Complex SQL logic with consent management, duplicate prevention, and multi-brand customer identification
Automated batch processing with cloud data warehouse to marketing platform integration and daily alerting
Cloud Data Warehouse • Marketing Automation Platform • Cloud Workflows • Infrastructure as Code • Python
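
A minimal sketch of the abandonment-selection step, with hypothetical table and column names standing in for the production schema; the real pipeline layers additional stages for campaign orchestration and delivery:

```python
# Minimal sketch: select cart abandoners who have marketing consent and have
# not already been contacted today. Table and column names are illustrative.
from google.cloud import bigquery

client = bigquery.Client()

QUERY = """
SELECT DISTINCT c.customer_id, c.email, c.brand, c.product_line
FROM `project.dataset.cart_events` AS c
LEFT JOIN `project.dataset.orders` AS o
  ON o.customer_id = c.customer_id
 AND o.order_ts BETWEEN c.cart_ts AND TIMESTAMP_ADD(c.cart_ts, INTERVAL 24 HOUR)
LEFT JOIN `project.dataset.sent_log` AS s
  ON s.customer_id = c.customer_id
 AND s.sent_date = CURRENT_DATE()              -- duplicate prevention
JOIN `project.dataset.consent` AS p
  ON p.customer_id = c.customer_id
 AND p.email_marketing_opt_in = TRUE           -- consent management
WHERE c.cart_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
  AND o.order_id IS NULL                       -- abandoned: no follow-up purchase
  AND s.customer_id IS NULL
"""

def fetch_abandoners() -> list[dict]:
    """Run the abandonment query and return rows as plain dicts."""
    return [dict(row) for row in client.query(QUERY).result()]
```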

Email Campaign Attribution Dashboard

Business Intelligence • Revenue Attribution

Comprehensive revenue attribution system tracking customer journeys from email campaigns to purchases across multiple brands with sophisticated product revenue calculations.

Multi-brand revenue attribution with 7-day attribution windows and complex product pricing logic
Cross-platform data integration linking email events, clickstream data, and purchase transactions
Powers executive Looker Studio dashboards for real-time campaign performance monitoring
BigQuery • SQL • Looker Studio • Cloud Scheduler • Revenue Attribution
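
A minimal sketch of a 7-day last-touch attribution join, shown here in pandas with hypothetical column names; the production logic runs as scheduled BigQuery SQL:

```python
# Minimal sketch of 7-day last-touch attribution; DataFrames and columns are
# illustrative stand-ins for the warehouse tables.
import pandas as pd

def attribute_revenue(clicks: pd.DataFrame, purchases: pd.DataFrame) -> pd.DataFrame:
    """Attach the most recent email click (within 7 days) to each purchase.

    clicks:    columns [customer_id, campaign_id, ts]
    purchases: columns [customer_id, order_id, revenue, ts]
    """
    clicks = clicks.sort_values("ts")
    purchases = purchases.sort_values("ts")
    attributed = pd.merge_asof(
        purchases,
        clicks,
        on="ts",                         # event timestamp
        by="customer_id",                # match within the same customer
        direction="backward",            # click must precede the purchase
        tolerance=pd.Timedelta(days=7),  # 7-day attribution window
    )
    # Revenue per campaign; purchases with no qualifying click stay unattributed.
    return (
        attributed.dropna(subset=["campaign_id"])
        .groupby("campaign_id", as_index=False)["revenue"]
        .sum()
    )
```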

AJO Campaign Anomaly Detection System

Data Engineering • Monitoring & Alerting

Automated anomaly detection system monitoring marketing automation campaigns with intelligent alerting for proactive campaign management.

Automated anomaly detection with containerized Python scripts and batch job orchestration
Daily alerting system for anomaly notifications
Proactive campaign monitoring preventing revenue loss through early anomaly detection
Python • Docker • Cloud Workflows • Cloud Scheduler • Secret Manager
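
A minimal sketch of one possible detection step, using a rolling z-score against each campaign's recent baseline; column names, window, and threshold are illustrative, not the production configuration:

```python
# Minimal sketch of the detection step over a daily metrics table.
import pandas as pd

Z_THRESHOLD = 3.0  # flag days more than 3 standard deviations from the baseline

def find_anomalies(daily: pd.DataFrame, window: int = 28) -> pd.DataFrame:
    """Flag campaign-days whose send volume deviates sharply from a rolling baseline.

    daily: columns [campaign_id, date, sends], one row per campaign per day.
    """
    daily = daily.sort_values(["campaign_id", "date"]).copy()
    grouped = daily.groupby("campaign_id")["sends"]
    baseline = grouped.transform(lambda s: s.rolling(window, min_periods=7).mean())
    spread = grouped.transform(lambda s: s.rolling(window, min_periods=7).std())
    daily["z_score"] = (daily["sends"] - baseline) / spread
    return daily[daily["z_score"].abs() > Z_THRESHOLD]
```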

Enterprise Data Analytics Platform

Data Engineering • Enterprise Scale

Enterprise-scale platform of 13+ production data pipelines and fault-tolerant ETL systems processing large volumes of customer data daily, orchestrating end-to-end data flows between the cloud data warehouse and marketing automation platforms.

Multi-domain architecture with batch processing, streaming data flows, and automated alerting systems
Cloud Workflows orchestrating complex ETL processes with fault-tolerant batch job execution
Comprehensive data governance with managed datasets, stored procedures, views, and automated schema management
Python • Pulumi • BigQuery • GCP • YAML
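
A minimal sketch of how BigQuery resources can be declared with Pulumi's Python SDK, assuming a hypothetical stack configuration; dataset names and labels are illustrative:

```python
# Minimal sketch of BigQuery infrastructure managed as code with Pulumi.
import pulumi
import pulumi_gcp as gcp

config = pulumi.Config()
env = config.require("environment")  # e.g. "dev", "staging", "prod"

# One dataset per environment, labelled so ownership is visible in the console.
marketing = gcp.bigquery.Dataset(
    "marketing-dataset",
    dataset_id=f"marketing_{env}",
    location="US",
    labels={"team": "data-engineering", "environment": env},
)

pulumi.export("marketing_dataset_id", marketing.dataset_id)
```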

Dating a Scientist

Machine Learning • Data Science

Advanced clustering analysis on the OkCupid dataset to identify user behavior patterns and improve matching algorithms.

User Segmentation using KMeans, Agglomerative Clustering, and DBSCAN
Feature Engineering with Min-Max scaling and advanced encoding techniques
Actionable insights for user engagement and retention strategies
Python • Scikit-Learn • Pandas • Clustering
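
A minimal sketch of the clustering comparison, assuming a numeric feature matrix already engineered from the profiles; cluster counts and parameters are illustrative:

```python
# Minimal sketch: scale features to [0, 1] and compare three clustering algorithms.
import pandas as pd
from sklearn.preprocessing import MinMaxScaler
from sklearn.cluster import KMeans, AgglomerativeClustering, DBSCAN
from sklearn.metrics import silhouette_score

def compare_clusterers(features: pd.DataFrame) -> dict[str, float]:
    """Return a silhouette score for each clustering approach."""
    X = MinMaxScaler().fit_transform(features)
    models = {
        "kmeans": KMeans(n_clusters=5, n_init=10, random_state=42),
        "agglomerative": AgglomerativeClustering(n_clusters=5),
        "dbscan": DBSCAN(eps=0.5, min_samples=10),
    }
    scores = {}
    for name, model in models.items():
        labels = model.fit_predict(X)
        if len(set(labels)) > 1:  # silhouette needs at least two clusters
            scores[name] = silhouette_score(X, labels)
    return scores
```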

Complete Poetry

Deep Learning • NLP

RNN-based deep learning model for automated poetry generation, exploring AI creativity and language modeling.

Trained on a dataset of 15,652 poems for comprehensive language understanding
Multi-layer RNN architecture for improved vocabulary and style generation
Research insights into challenges and limitations of AI-driven creative tasks
TensorFlow • Keras • RNN • NLP
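
A minimal sketch of a stacked recurrent language model of the kind used here, shown with LSTM layers and illustrative hyperparameters; vocabulary size and layer widths are assumptions, not the trained configuration:

```python
# Minimal sketch: stacked recurrent model predicting the next word in a poem.
import tensorflow as tf

VOCAB_SIZE = 20_000  # size of the word index produced by the tokenizer
EMBED_DIM = 128

def build_model() -> tf.keras.Model:
    """Next-word language model for poetry generation."""
    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),
        tf.keras.layers.LSTM(256, return_sequences=True),  # first recurrent layer
        tf.keras.layers.LSTM(256),                          # second recurrent layer
        tf.keras.layers.Dense(VOCAB_SIZE, activation="softmax"),
    ])
    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
```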

Let's Connect

I'm always interested in discussing data engineering opportunities, collaborating on innovative projects, or sharing insights about the latest in data technology and machine learning.