Job Title: Data Engineer
Location: Charlotte, NC – hybrid, onsite 3 days per week
Salary Range: $120,000 to $140,000 per year
*Permanent, direct-hire opportunity – not open to third parties*
Overview:
- This role focuses on designing, developing, and supporting modern data solutions on a cloud-based data stack. The ideal candidate is highly self-motivated, resourceful, and capable of building scalable data pipelines, frameworks, and full-stack data applications using Snowflake, Python, SQL, and modern DevOps practices.
Key Responsibilities:
- Design, build, and optimize robust data pipelines and workflow processes
- Develop and maintain scalable data models and warehouse architecture
- Create backend services and data-driven applications across the full stack
- Implement and manage cloud-native data solutions leveraging Snowflake
- Write clean, efficient, and maintainable code in Python and SQL
- Establish and maintain automated testing practices for data systems
- Develop reusable components and frameworks to accelerate data engineering efforts
- Set up and support CI/CD processes using modern version control tools (e.g., GitHub)
- Deliver data visualization and reporting solutions using tools such as Power BI
- Monitor system performance and troubleshoot issues to ensure data accuracy and availability
Required Skills:
- Hands-on experience with Snowflake (typically 3+ years)
- Extensive experience with Python and SQL (generally 7+ years each)
- Proven background in full-stack or backend-focused development
- Experience designing and optimizing data warehouse solutions
- Familiarity with automated testing and building engineering frameworks
- Experience working with CI/CD pipelines and version control systems
- Exposure to cloud platforms, with preference for Azure environments
- Bachelor’s degree in a related field or equivalent professional experience
Preferred Skills:
- Experience with orchestration tools such as Dagster
- Familiarity with dbt for data transformations
- Knowledge of optimization tools like Gurobi
- Experience with data validation techniques and tooling
- Exposure to Retool or similar internal app platforms
- Familiarity with managed file transfer solutions (e.g., GoAnywhere)
- Experience building dashboards using Power BI and DAX
- Hands-on experience with Azure services (ADF, Synapse, Key Vault)
- Exposure to machine learning capabilities in Snowflake or use of R
- Experience with SQL Server and related tools (SSIS, SSRS)
- Background in financial services or similar regulated industries
- Experience working in Agile development environments
If you are interested in the role, please apply and email your most up-to-date resume to Liam at [email protected].
