
Simon Akhmedov

Senior Data Engineer

Transamerica | San Jose, CA


My Portfolio Highlights

My New Course

Introduction to SQL

My New Track

Understanding Data Topics

My New Workspace

Course Notes: Time Series Analysis in SQL Server

Analytical wordsmith, crafting compelling narratives through data storytelling.

My Work

Take a look at my latest work.


Course Notes: Introduction to SQL Server


Course Notes: Time Series Analysis in SQL Server


Analyzing unicorn company data

My Certifications

These are the industry credentials that I’ve earned.

Other Certificates

Microsoft | Microsoft Certified: Azure Fundamentals

DataCamp Course Completion

Take a look at all the courses I’ve completed on DataCamp.

My Work Experience

Where I've interned and worked during my career.

Transamerica | Jan 2023 - Present

Senior Data Engineer

• Play a pivotal role on the Data Engineering team, consistently meeting development requirements by leveraging T-SQL, SSIS (SQL Server Integration Services), ADF v2 (Azure Data Factory), Power BI, Tableau, and various Azure tools;
• Conduct daily scrum meetings with Project Managers, Business Analysts, and end users/clients to gather, analyze, and document business requirements and business rules;
• Assist the Project Manager in leading multifunctional project teams, facilitating the implementation of data conversion strategies and ensuring critical information reaches team members;
• Work extensively with SQL Server and T-SQL, implementing, maintaining, and developing stored procedures, triggers, nested queries, joins, views, user-defined functions, indexes, user profiles, and relational database models, as well as creating and updating tables; ensure database consistency by running DBCC (Database Console Commands) checks;
• Work with Azure Resource Manager (ARM) templates, Azure Functions, Logic Apps, Azure Kubernetes Service (AKS), Azure Container Registry (ACR), Azure Cosmos DB, ADF (Azure Data Factory), Azure Data Lake Storage, and Azure Stream Analytics;
• Develop and deploy Azure-based applications using services such as Azure Virtual Machines, Azure Storage, Azure SQL Database, and Azure Active Directory;
• Enforce industry-leading practices in SQL database projects, including version control with Azure DevOps and GitHub, code reviews, and continuous integration and deployment (CI/CD) pipelines;
• Build efficient ETL (Extract, Transform, Load) packages with SSIS and ADF v2 pipelines to process fact and dimension tables, handling complex transformations and SCD (Slowly Changing Dimension) type 1 and type 2 changes; develop Python and ADF v2 workflows to automate data flow;
• Implement incremental data loading with SSIS packages, SQL tasks, and merge operations to synchronize data between the Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) databases;
• Develop Power BI and Tableau reports and dashboards for Finance, Accounting, Management, and other client teams;
• Use Power BI to create filters, reports, and dashboards, incorporating diverse chart types, visualizations, and intricate calculations to manipulate data;
• Enforce data security and governance in Snowflake, including role-based access control, row-level security, and data masking, ensuring data privacy and compliance with regulatory standards;
• Build data pipelines with Snowflake tasks and streams, enabling real-time data ingestion and processing for operational reporting and analytics.
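The SCD type 2 merge pattern described above can be sketched in Python with pandas. This is a minimal, illustrative sketch, not the production T-SQL/SSIS implementation; the function, table, and column names (`apply_scd2`, `cust_id`, `city`, `valid_from`, `valid_to`, `is_current`) are hypothetical:

```python
import pandas as pd

def apply_scd2(dim, incoming, key, attrs, load_date):
    """SCD type 2 sketch: expire the current row of any key whose tracked
    attributes changed, and append a new current version for changed or
    brand-new keys."""
    current = dim[dim["is_current"]]

    # Compare incoming rows against the current dimension versions
    merged = incoming.merge(
        current[[key] + attrs], on=key, how="left",
        suffixes=("", "_cur"), indicator=True,
    )
    new_keys = merged["_merge"] == "left_only"
    changed = (merged["_merge"] == "both") & (
        merged[attrs]
        .ne(merged[[a + "_cur" for a in attrs]].rename(columns=lambda c: c[:-4]))
        .any(axis=1)
    )

    # New current versions for changed and new keys
    to_insert = merged.loc[new_keys | changed, [key] + attrs].copy()
    to_insert["valid_from"] = load_date
    to_insert["valid_to"] = pd.NaT
    to_insert["is_current"] = True

    # Expire the superseded versions of changed keys
    dim = dim.copy()
    expire = dim[key].isin(merged.loc[changed, key]) & dim["is_current"]
    dim.loc[expire, "valid_to"] = load_date
    dim.loc[expire, "is_current"] = False

    return pd.concat([dim, to_insert], ignore_index=True)

# Tiny demo: customer 2 moves from LA to SF, customer 3 is new
dim = pd.DataFrame({
    "cust_id": [1, 2],
    "city": ["NY", "LA"],
    "valid_from": [pd.Timestamp("2022-01-01")] * 2,
    "valid_to": [pd.NaT, pd.NaT],
    "is_current": [True, True],
})
incoming = pd.DataFrame({"cust_id": [2, 3], "city": ["SF", "CHI"]})
result = apply_scd2(dim, incoming, "cust_id", ["city"], pd.Timestamp("2023-06-01"))
```

In a warehouse this same pattern is typically expressed as a T-SQL `MERGE` (update the expiring row, insert the new version), with the pandas version useful only for small reference dimensions or testing.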

STL Truckers LLC | Apr 2021 - Dec 2022

BI Developer

• Assumed a leadership role on the BI team, establishing comprehensive guidelines and procedures for data extraction, ingestion, modeling, and analytics; led technical planning and requirements gathering, including architectural design, development, testing, and project delivery;
• Developed intricate logic and implemented indexed tables, views, and stored procedures tailored to specific business requirements and rules; crafted complex T-SQL queries using advanced features such as intricate joins, MERGE operations, and EXCEPT;
• Engaged in SSIS development and delivered robust ETL (Extract, Transform, Load) solutions integrating data from diverse sources, including flat files (delimited, fixed width), Excel, SQL Server, raw files, and DB2, into the central Online Transaction Processing (OLTP) database;
• Architected and implemented ETL workflows using SQL Server Integration Services (SSIS), Azure Data Factory (ADF), Azure Synapse Pipelines, Databricks, Azure Logic Apps, and Azure Function Apps;
• Developed PySpark code in Databricks notebooks to process and analyze large-scale datasets, leveraging the distributed computing capabilities of Apache Spark;
• Wrote custom C# code in script tasks for ETL processes, ensuring efficient data integration; maintained and updated ETL packages, applying high-performance optimizations to improve processing speed and efficiency;
• Created data models in Erwin, Lucidchart, and Microsoft Visio to visualize data structures and ensure data integrity;
• Extracted data from various sources, including Azure SQL Database, Azure SQL Data Warehouse, and AWS data stores such as RDS MySQL, Aurora, and Redshift, for use as data sources in Power BI reports;
• Collaborated on the design, development, maintenance, and delivery of new Power BI reports of various types (Scorecard, Summary, E2E, KTLO, and others); implemented complex DAX queries for measures, calculated columns, drill-down, drill-through, hyperlink options, dynamic slicing and dicing, custom tooltips, and Key Performance Indicators (KPIs);
• Migrated dashboards and reports from Tableau to Power BI, ensuring a seamless transition for reports drawing data from sources such as Excel, SharePoint lists, SQL Server, Galaxy System DB2, and SSAS multidimensional cubes.

Walmart | Dec 2019 - Mar 2021

Data Analyst

• Collaborated with data scientists and analysts to extract meaningful information from raw data, supporting data-driven business strategies and informed decision-making;
• Methodically analyzed and interpreted business and data transformation rules, enabling efficient data processing;
• Crafted complex stored procedures, triggers, user-defined functions (UDFs), indexes, tables, views, and other T-SQL code, including SQL joins, tailored for SSIS packages and Tableau reports;
• Implemented Amazon Redshift for high-performance data warehousing, optimizing query performance and facilitating advanced analytics on large datasets;
• Used Amazon S3 (Simple Storage Service) for secure, scalable object storage, managing and retrieving data assets for various applications and use cases;
• Leveraged AWS Athena for interactive, ad-hoc querying of data stored in Amazon S3, simplifying data analysis and exploration;
• Implemented AWS Glue for ETL (Extract, Transform, Load) processes, automating data preparation and integration and ensuring data quality and accessibility for analytics and reporting;
• Used Python with NumPy, Pandas, and Matplotlib in Jupyter Notebook environments on AWS SageMaker for exploratory data analysis (EDA);
• Employed NumPy for efficient numerical operations and data-array manipulation, strengthening data preprocessing and analysis;
• Leveraged Pandas for data manipulation, cleaning, and transformation, enabling structured data exploration and feature engineering during EDA;
• Used Tableau for data visualization and analysis, creating interactive, informative dashboards and reports to convey insights;
• Conducted data exploration and transformation within Tableau, preparing data sources for analysis by cleaning, aggregating, and structuring datasets;
• Collaborated with cross-functional teams to gather requirements and define key performance indicators (KPIs) for Tableau dashboards, ensuring alignment with business objectives.
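A first-pass EDA of the kind mentioned above might look like the following minimal pandas/NumPy sketch. The dataset and column names (`store`, `units`, `price`) are illustrative, and the Matplotlib plotting step is omitted for brevity:

```python
import numpy as np
import pandas as pd

# Hypothetical sales dataset standing in for raw source data
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "store": rng.choice(["A", "B", "C"], size=100),
    "units": rng.integers(1, 20, size=100),
    "price": rng.uniform(5.0, 50.0, size=100).round(2),
})
df["revenue"] = df["units"] * df["price"]

# Typical first-pass checks: summary statistics and missing values
profile = df.describe()
missing = df.isna().sum()

# Aggregate a KPI per store, as one might before building a Tableau view
kpi = (df.groupby("store", as_index=False)["revenue"]
         .agg(total="sum", average="mean")
         .sort_values("total", ascending=False))
```

From here, `df.plot` or Matplotlib histograms and scatter plots would round out the exploration before handing the prepared source to a dashboard.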

Uber Technologies, Inc | Aug 2015 - Nov 2019

SQL/ETL Developer

• Designed and executed intricate queries, reports, and dashboards using data visualization tools such as Power BI, Cognos Analytics, MS Excel, and ServiceNow's Performance Analytics;
• Implemented robust data validation and integrity checks in T-SQL to ensure data accuracy and consistency;
• Conducted performance tuning and optimization of T-SQL queries to improve query response times and resource utilization;
• Designed and maintained ETL (Extract, Transform, Load) processes in SQL, enabling the extraction, transformation, and loading of data from diverse sources into the data warehouse;
• Created efficient SSIS Data Flow tasks to streamline the movement and transformation of data between source and destination systems;
• Integrated SSIS packages with third-party systems, databases, and APIs to automate data imports and exports;
• Designed and constructed multidimensional and tabular models in SSAS to support Online Analytical Processing (OLAP) and data mining;
• Generated calculated measures, calculated members, and Key Performance Indicators (KPIs) in SSAS to enable advanced data analysis;
• Contributed to the design, development, debugging, and testing of SQL Server Reporting Services (SSRS) reports tailored to specific business requirements, delivering actionable insights to stakeholders;
• Implemented parameterized SSRS reports, empowering users to customize and filter data according to their needs.

My Education

Take a look at my formal education.

Bachelor's in Computer Science

Gubkin Russian State University | 2015

About Me

Simon Akhmedov

Senior Data Engineer with over 8 years of experience in end-to-end data pipeline management, including database design, maintenance, ETL, data validation, and report generation.

