Summary
Overview
Work History
Education
Skills
Side Projects
Profile
Timeline

Edgar Morel

Cloud Data Engineer

Summary

He brings over a decade of experience in data analysis and engineering, with a strong focus on helping organizations deliver end-to-end analytics solutions. His expertise spans the entire data lifecycle, from extraction, storage, and cleansing to preparation and visualization, supporting informed decision-making.

In recent years, he has pivoted toward cloud and distributed technologies, guiding multiple firms through the transition from legacy systems to modern cloud-based solutions.

Throughout his career, he has empowered numerous clients to achieve their objectives through effective analytics platform solutions.

Overview

14
years of professional experience

Work History

Senior Data Engineer

MedImpact (MedImpact Healthcare Systems Company - US)
San Diego
01.2024 - 01.2025

Project: Enhancement of analytical platform

  • Migrated key business data across multiple environments, ensuring data accuracy and consistency.
  • Automated migration workflows using Python, Oracle PL/SQL, SQL, and Java, preserving complex data relationships.
  • Developed solutions with Oracle APEX for software development, data analysis, and reporting, enhancing decision-making processes.
  • Optimized queries through in-depth analysis, significantly improving overall efficiency.

Technologies used: Oracle, SQL, PL/SQL, Python, Git, Java, APEX.
Project Outcome: Enhanced the centralized data warehouse, speeding up report retrieval, and migrated key production tables.

Data Engineer Lead

Bix Paraguay
Asuncion, Paraguay
04.2023 - 12.2023

Project: Analytics Platform for the Cervantes Institute - Spanish Government

  • Led ETL development at the Cervantes Institute (IC), enhancing data processes across departments.
  • Managed ETL processes for on-premise legacy databases in the Academic, Culture, Administration, Accounting, and Marketing departments.
  • Developed an automated PySpark/SQL/YAML ETL module that reads files from Hadoop HDFS, builds dimension and fact tables, constructs data marts, handles schemas automatically, simplifies SQL maintenance, tracks data lineage, uses the Great Expectations and Pytest libraries for QA automation, and generates logs for ETL tracking and documentation.
  • Created a Golden Record module using PySpark and Python, and implemented Master Data Management (MDM) to ensure data consistency, accuracy, and usability across all areas of the Cervantes Institute.

Technologies used: Cloudera CDH, Apache NiFi, PySpark, Python, SQL, MSSQL, Oracle 11g, Hadoop, HDFS, Hive, Power BI, Apache Atlas, Presto/Trino, IBM Cloud Pak for Data, Iceberg, Avro, Parquet. QA: PyTest, Great Expectations (GE).

Project Outcome: A data analytics platform used by more than 300 people across Cervantes Institute offices in more than 40 countries and 100 centres.

ETL Developer - Data Engineer

Surest/Bind (United Healthcare Company)
Minneapolis
02.2020 - 03.2023

Project: ETL development, data warehouse maintenance, and data mart development.

  • Developed and maintained large-scale data pipelines using Databricks, PySpark, and Delta Lake to process over 100 TB of data.
  • Implemented robust ETL processes, data quality checks (Great Expectations), and QA automation (PyTest).
  • Migrated legacy pipelines and functions to Azure Data Factory, integrating with Azure services such as Blob Storage and Data Lake Storage Gen 2.
  • Optimized job parallelization using Azure Batch, ensuring efficient data processing.
  • Collaborated in agile environments (Scrum/Kanban) and produced reports with Power BI for data analytics and decision-making.

Technologies used: AWS, AWS Glue, Hive, Trino, Starburst, Delta Lake, Python, PySpark, Databricks, Ascend.io, Power BI, PyTest, Great Expectations (GE), Azure Data Factory (ADF), Azure Functions, Azure Batch, Azure Blob Storage, Azure Data Lake Storage Gen 2.

Project Outcome: Creation and maintenance of pipelines that ingest more than 100 terabytes of data from multiple sources.

Senior Data Engineer

Netel (payment gateway, a fintech company)
Asuncion, Paraguay
01.2019 - 02.2020

Project: Data Warehouse maintenance and Data Mart development.

  • ETL development from multiple sources (DBMS, API, Web)
  • Reporting solution development using Oracle, PostgreSQL, Excel, and Tableau
  • Machine learning models for anti-churn and recommendation systems.

Technologies used: Oracle, MSSQL, PostgreSQL, Tableau, QGIS, R, Python, SQL

Project Outcome: Development of new Data Marts, machine learning models, and analytical tools for use by senior management and the marketing department.

Data Engineer

TIGO (Telecom Company)
Asuncion, Paraguay
11.2014 - 01.2019

Project: Data Warehouse maintenance and Data Mart development.

  • ETL processes from multiple sources (DBMS, Web)
  • Data modeling and implementation of anti-churn campaigns
  • Customer segmentation using machine learning techniques
  • Go-to-market expansion using hot-spot analysis with MapInfo Pro, SQL, and PostGIS
  • Communication of findings and results to management and directors
  • Participated in the migration of an on-premise data warehouse to an off-premise data lake.

Technologies used: Oracle, Python, R, SQL, MapInfo, PostGIS, Excel, Tableau

Project Outcome: Development of analytical tools adopted by over 50 users across senior management and commercial sectors.

ETL Developer - Data Engineer

Personal (Telecom Company)
Asuncion, Paraguay
11.2013 - 11.2014

Project: Data Warehouse maintenance and ETL monitoring

  • Maintenance and monitoring of existing data pipelines.

Technologies used: Oracle 11g, SSIS, MSSQL, PL/SQL, Excel


Project Outcome: Maintenance of data pipelines serving over 30 stakeholders, including senior management and commercial departments.

Oracle Developer

Kaneko (Pharmaceutical Company)
Asuncion, Paraguay
01.2011 - 11.2013

Project: ERP maintenance

  • Maintenance of ERP modules
  • Creation of new features for ERP.

Technologies used: Oracle 11g, PL/SQL, Excel, Java Web

Project Outcome: Development of an analytical platform utilized by 10 professionals in commercial and purchasing departments.

Education

Bachelor's Degree in Computer Science

Universidad Nacional de Asuncion

Quality Engineering Training

Mentormate

Skills

  • Python: 10 years
  • Oracle: 12 years
  • PL/SQL: 12 years
  • SQL: 12 years
  • Databricks: 5 years
  • PySpark: 5 years
  • AWS: 4 years
  • Azure Data Factory: 4 years
  • Power BI: 1 year
  • Tableau: 2 years
  • Great Expectations (GE): 4 years
  • PyTest: 4 years
  • PostgreSQL: 6 years
  • MSSQL: 10 years
  • R: 5 years
  • Hadoop: 4 years
  • Hive: 4 years
  • Trino/Presto: 2 years
  • Delta Lake: 4 years
  • Apache Nifi: 1 year
  • GIS: 5 years

Side Projects

CTO and Founder: Datanous

1.- Company name: Big Savings (Retail Company)
Project: Implementation and Cloud migration of Analytics Platform
• Led the ETL process and implemented Oracle BI Datawarehouse.
• Led the ETL migration from the Oracle BI solution to SSIS packages, MSSQL, and Power BI.
• Maintained ETL solutions and performed data analysis and data visualization with Oracle BI and Power BI.

Project Outcome: Developed an analytics solution now in active use by senior management and commercial areas for strategic decisions, aiding the company's successful migration from legacy to modern distributed technologies.


2.- Company name: TDP (Technology for the Development of Paraguay - A fintech company)
Project name: Creation of an analytic platform
• Led the creation of an analytics platform.
• ETL data from multiple sources.
• Created a data warehouse on PostgreSQL following medallion architecture principles.
• Conducted quality engineering with Great Expectations and PyTest.
• Visualized data using Power BI and Excel.

Project Outcome: Successful creation of an analytics solution for the company that is currently in production and actively utilized by senior management for strategic decision-making.

3.- Company name: TDP (Technology for the Development of Paraguay - A fintech company)
Project name: Web Application Development for logistic and trade marketing
• Implemented a web application using Flask, Docker, Nginx, and PostgreSQL.
• Utilized PostGIS for spatial data and OSRM for shortest path calculations.
• Managed containerized deployments and reverse proxy configuration.
• Integrated SSL/TLS for secure connections.
• Deployed and managed cloud infrastructure.
• Developed RESTful APIs for geospatial queries.
• Optimized application and database performance.
• Provided technical support and created documentation.
Technologies used: Flask, Docker, Nginx, PostgreSQL, PostGIS, OSRM, Git, SSL/TLS

Project Outcome: Developed a robust, secure web application with advanced GIS capabilities for route optimization and spatial analysis.

Profile

Professional Background:
     - Software Engineer with more than 11 years of experience in data engineering and software development.
     - Expertise in building and maintaining automated data extraction tools from multiple sources.
Technical Skills and Achievements:
     - Proficient in database management and data reorganization for readability.
     - Skilled in performing data analysis using statistical tools to assess data quality and meaning.
     - Capable of interpreting patterns and trends in datasets for diagnostic and predictive purposes.
     - Experienced in preparing management reports highlighting trends, patterns, and predictions.
Early Career:
     - Initially focused on software development projects using Java and web technologies.
     - Developed a reporting platform for a pharmaceutical company's directives.
     - Held responsibilities in project and team management.
Career Evolution:
     - Transitioned to Data Engineering, following a passion for data analysis and engineering.
     - Enabled numerous clients to attain success via analytics platform solutions.

Timeline

Senior Data Engineer

MedImpact (MedImpact Healthcare Systems Company - US)
01.2024 - 01.2025

Data Engineer Lead

Bix Paraguay
04.2023 - 12.2023

ETL Developer - Data Engineer

Surest/Bind (United Healthcare Company)
02.2020 - 03.2023

Senior Data Engineer

Netel (payment gateway, a fintech company)
01.2019 - 02.2020

Data Engineer

TIGO (Telecom Company)
11.2014 - 01.2019

ETL Developer - Data Engineer

Personal (Telecom Company)
11.2013 - 11.2014

Oracle Developer

Kaneko (Pharmaceutical Company)
01.2011 - 11.2013

Bachelor's Degree in Computer Science

Universidad Nacional de Asuncion

Quality Engineering Training

Mentormate