FinOps Data Engineer
Solveify Tech
Compensation: 15 LPA
Location: Remote
Posted: December 22, 2025
Job Description
We are seeking a FinOps Data Engineer to bridge the gap between cloud financial operations and engineering. The role involves designing and implementing ETL pipelines, building cost dashboards, ensuring tagging compliance, detecting anomalies, and collaborating with stakeholders to optimize cloud spend. The ideal candidate has strong experience in DevOps, data engineering, and cloud cost management.
Roles & Responsibilities
• Design and develop ETL solutions for cost and usage data using best practices for data warehousing and analytics.
• Analyze cloud cost and usage across AWS, Databricks, and other cloud and on-prem platforms.
• Build and maintain cloud cost dashboards and reporting solutions for visibility across lines of business (LOBs) and programs.
• Implement tagging standards, establish compliance checks, and generate reports to verify adherence (see the tagging-audit sketch after this list).
• Detect and analyze cost anomalies and usage patterns; proactively identify optimization opportunities using AWS Cost Explorer, with Databricks system tables and backup tables as drivers (see the cost roll-up sketch after this list).
• Collaborate with stakeholders (DevOps, application teams, Finance, architecture, infrastructure) to implement cost-saving strategies following FinOps Foundation standards.
• Develop automated workflows for data ingestion, transformation, and validation.
• Document processes, data flows, and standards for FinOps operations.
• Work with vendors and internal teams to ensure KPIs for cost and tagging compliance are met.
• Enable accurate showback/chargeback models aligned to LOB/Program.
• Support forecast-vs.-actual reporting and provide monthly FinOps insights by enabling automated workflows, alerts, notifications, and guardrails.
• Work with DevOps teams on cluster governance, resource control, policy enforcement, and guardrails; build cost-validation queries and granular user-level cost views by tags (LOBs, data products, sessions), rolled up to workspace-level cost.
• Manage pipelines for ETL jobs, infrastructure automation, and monitoring tools.
• Implement cost-aware DevOps practices (auto-scaling, scheduling, workload orchestration).
• Collaborate on implementing cluster policies, SQL warehouse governance, and operational efficiency.
• Apply deep working knowledge of on-prem and cloud ESB architecture to address client requirements for scalability, reliability, security, and performance.
• Provide technical assistance in identifying, evaluating, and developing systems and procedures.
• Manage foundational data administration tasks such as scheduling jobs, troubleshooting job errors, identifying issues with job windows, and assisting with database backups and performance tuning.
• Design, develop, test, and adapt ETL code and jobs to accommodate changes in source data and new business requirements.
• Proactively communicate innovative ideas, solutions, and capabilities beyond the specific task request.
• Effectively communicate status and workloads, and offer to assist other areas.
• Work collaboratively within a team as well as independently; continuously strive for high-performing business solutions.
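The tagging-compliance bullet above lends itself to lightweight automation. What follows is a minimal sketch, assuming boto3 credentials are configured and using an illustrative set of required tag keys (LOB, Program, Environment are placeholders for the actual standard); it lists resources missing any required key via the AWS Resource Groups Tagging API.

    # Hypothetical tagging-compliance audit: flags AWS resources that are
    # missing required cost-allocation tags. Tag keys are illustrative.
    import boto3

    REQUIRED_TAGS = {"LOB", "Program", "Environment"}  # assumed standard

    def find_noncompliant_resources(region="us-east-1"):
        """Return (arn, missing_keys) pairs for under-tagged resources."""
        client = boto3.client("resourcegroupstaggingapi", region_name=region)
        paginator = client.get_paginator("get_resources")
        noncompliant = []
        for page in paginator.paginate():
            for mapping in page["ResourceTagMappingList"]:
                present = {tag["Key"] for tag in mapping.get("Tags", [])}
                missing = REQUIRED_TAGS - present
                if missing:
                    noncompliant.append((mapping["ResourceARN"], sorted(missing)))
        return noncompliant

    if __name__ == "__main__":
        for arn, missing in find_noncompliant_resources():
            print(f"{arn}: missing {', '.join(missing)}")

A report like this can feed the compliance dashboards and KPI tracking described above.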
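Likewise, the granular tag-level cost roll-up might start from a query over Databricks system tables, as sketched below. This assumes a Databricks notebook context (where spark and display are predefined), a custom_tags key named LOB (an assumption about the tagging standard), and list-price-based cost estimation rather than actual billed amounts.

    # Minimal sketch: estimated daily cost per LOB tag, rolled up by
    # workspace, from Databricks system billing tables.
    query = """
    SELECT
      u.usage_date,
      u.workspace_id,
      COALESCE(u.custom_tags['LOB'], 'untagged') AS lob,
      SUM(u.usage_quantity * p.pricing.default) AS estimated_cost_usd
    FROM system.billing.usage AS u
    JOIN system.billing.list_prices AS p
      ON u.sku_name = p.sku_name
      AND u.usage_start_time >= p.price_start_time
      AND (p.price_end_time IS NULL OR u.usage_start_time < p.price_end_time)
    GROUP BY u.usage_date, u.workspace_id,
             COALESCE(u.custom_tags['LOB'], 'untagged')
    ORDER BY u.usage_date, estimated_cost_usd DESC
    """
    display(spark.sql(query))  # display() is a Databricks notebook helper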
Competencies & Experience Required/Desired
• 6+ years in Data Engineering with strong ETL design, development, and optimization experience.
• Hands-on experience with AWS services and cost management tools.
• Strong knowledge of tagging strategies and governance in multi-cloud environments.
• Proficiency in SQL, PL/SQL, and data warehousing best practices.
• Experience with DevOps practices, CI/CD pipelines, and automation tools.
• Familiarity with FinOps principles and cloud cost optimization techniques.
• Ability to analyze large datasets and detect anomalies using scripting or BI tools (e.g., Power BI, Tableau); see the scripting sketch after this list.
• Excellent communication skills to work with technical and business stakeholders.
• Strong problem-solving capabilities; results-oriented, relying on fact-based logic for decision-making.
• Ability to manage multiple projects and work streams at once and to deliver results against project deadlines.
• Willingness to flex the daily work schedule to accommodate time-zone differences for global team communication.
• Strong interpersonal and communication skills.
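As one concrete example of scripted anomaly detection (referenced in the bullet above), a trailing-window z-score over daily cost is a reasonable first pass. The sketch below uses pandas; the 14-day window and 3-sigma threshold are illustrative defaults, not prescribed values.

    # Illustrative anomaly check: flag days whose cost deviates more than
    # `threshold` standard deviations from a trailing-window baseline.
    import pandas as pd

    def flag_cost_anomalies(daily_cost: pd.Series, window: int = 14,
                            threshold: float = 3.0) -> pd.Series:
        """daily_cost: cost indexed by date. Returns boolean anomaly flags."""
        # shift(1) keeps the current day out of its own baseline
        baseline = daily_cost.rolling(window, min_periods=window).mean().shift(1)
        spread = daily_cost.rolling(window, min_periods=window).std().shift(1)
        zscore = (daily_cost - baseline) / spread
        return zscore.abs() > threshold

The same flags can drive the alerts and notifications mentioned in the responsibilities above.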
Nice-to-Have Skills & Experience
• Experience with other ETL tools/services.
• Experience with data integration platforms.
• Experience with visualization tools such as Tableau, Power BI, or similar.
• Experience developing basic data science models using Python or a similar language.
• Leadership qualities and experience mentoring team members.
• AWS certification is a plus.