At Globe, our goal is to create a wonderful world for our people, business, and nation. By uniting people of passion who believe they can make a difference, we are confident that we can achieve this goal.
Build and maintain robust data platforms for operational use cases.
Design and implement scalable, reusable data pipelines, define frameworks, and lead data-related projects to support the company's operational needs.
Lead the design, development, and maintenance of a scalable data platform to support various operational data use cases.
Define and establish frameworks, governance, and best practices for data pipeline design, development, and deployment, ensuring efficiency and scalability.
Architect, design, and implement reusable, efficient data pipelines for ingesting, processing, and transforming data from diverse sources into formats ready for operational use, while adhering to best practices in automation and optimization.
Provide accurate level-of-effort estimates for new initiatives and change requests, facilitating effective project evaluation, budgeting, and tracking for success.
Develop proofs-of-concept, minimum viable products (MVPs), prototypes, and product demos to showcase the feasibility and potential of new data solutions.
Utilize project management methodologies and tools to plan, execute, and monitor data projects, ensuring timely delivery and alignment with business objectives.
Stay up-to-date with emerging data technologies, frameworks, and tools, and assess their potential impact on the organization’s data strategy and operations.
Ensure that data security best practices are integrated into all aspects of data platform development and pipeline management.
Support the design of data pipelines, data lakes, data warehouses, and machine learning solutions that power other business units with data.
Build and operate scalable data pipelines, lakehouses, and machine learning infrastructure on AWS or GCP.
Maintain efficient, accurate, and secure data storage and flow, and right-size data infrastructure resources based on usage analytics and workload profiling.
Assess operationalization of technologies like distributed computing, streaming analytics, and cloud-native data patterns in ISG’s environment.
Collaborate with Cloud & Infra Engineers to implement automation and policies that eliminate waste, shut down unused resources, and increase efficiency.
Troubleshoot and resolve data-related issues.
Throughout your day, you will tackle challenges in data integration, optimization, and architecture, all while ensuring that Globe's data remains accurate, accessible, and secure. Your work forms the backbone of our data-centric processes.
Strong background in programming languages such as Python, SQL, and Java
In-depth knowledge of data engineering tools (Airflow, NiFi, Dataflow, AWS Glue) is preferred
Experience in ETL development and management
Experience with data warehousing solutions (Snowflake, Redshift, BigQuery)
Experience with Cloud Platforms (AWS and GCP)
In-depth experience in automated data pipeline design, implementation, and support
Experience in data processing, data tools, and data modeling
Familiarity with GitOps Principles
Experience in DevSecOps Processes
Understanding of data engineering design and development best practices
Understanding of data management best practices
Experience: 5+ years of hands-on experience as a Data Engineer designing and implementing data solutions and data pipelines.
Problem-Solving Mindset: Analytical, with the ability to take on abstract business challenges and design practical, scalable solutions.
Strong Communication Skills: Comfortable translating technical requirements into clear, concise business objectives.
Team Leadership: Experienced in guiding cross-functional teams to success in a fast-paced, agile environment.
Make Your Passion Part of Your Profession. Attracting the best and brightest Talents is pivotal to our success. If you are ready to share our purpose of Creating a Globe of Good, explore opportunities with us.