
607 Dataflow Jobs - Page 4

Set up a job alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

4.0 - 6.0 years

1 - 4 Lacs

Hyderabad

On-site

Job Title: Senior Data Analyst – AdTech (Team Lead)
Location: Hyderabad
Experience Level: 4–6 Years
Employment Type: Full-time
Shift Timings: 5 PM - 2 AM IST

About the Role:
We are looking for a highly experienced and hands-on Senior Data Analyst (AdTech) to lead our analytics team. This role is ideal for someone with a strong background in log-level data handling, cross-platform data engineering, and a solid command of modern BI tools. You'll play a key role in building scalable data pipelines, leading analytics strategy, and mentoring a team of analysts.

Key Responsibilities:
- Lead and mentor a team of data analysts, ensuring quality delivery and technical upskilling.
- Design, develop, and maintain scalable ETL/ELT pipelines using GCP tools (BigQuery, Dataflow, Cloud Composer, Cloud Functions, Pub/Sub); see the sketch below.
- Ingest and process log-level data from platforms like Google Ad Manager, Google Analytics (GA4/UA), DV360, and other advertising and marketing tech sources.
- Build and optimize data pipelines from diverse sources via APIs, cloud connectors, and third-party tools (e.g., Supermetrics, Fivetran, Stitch).
- Integrate and manage data across multiple cloud platforms and data warehouses such as BigQuery, Snowflake, DOMO, and AWS (Redshift, S3).
- Own the creation of data models, data marts, and analytical layers to support dashboards and deep-dive analyses.
- Build and maintain scalable, intuitive dashboards using Looker Studio, Tableau, Power BI, or Looker.
- Partner with engineering, product, revenue ops, and client teams to gather requirements and drive strategic insights from data.
- Ensure data governance, security, and quality standards are followed across the analytics ecosystem.

Required Qualifications:
- 4–6 years of experience in data analytics or data engineering roles, with at least 1–2 years in a leadership capacity.
- Deep expertise working with log-level AdTech data: Google Ad Manager, Google Analytics (GA4), programmatic delivery logs, and campaign-level data.
- Strong knowledge of SQL and Google BigQuery for large-scale data querying and transformation.
- Hands-on experience building data pipelines using GCP tools (Dataflow, Composer, Cloud Functions, Pub/Sub, Cloud Storage).
- Proven experience integrating data from various APIs and third-party connectors.
- Experience working with multiple data warehouses: Snowflake, DOMO, AWS Redshift, etc.
- Strong skills in data visualization tools: Looker Studio, Tableau, Power BI, or Looker.
- Excellent stakeholder communication and documentation skills.

Preferred Qualifications:
- Scripting experience in Python or JavaScript for automation and custom ETL development.
- Familiarity with version control (e.g., Git), CI/CD pipelines, and workflow orchestration.
- Exposure to privacy regulations and consent-based data handling in digital advertising (GDPR, CCPA).
- Experience working in agile environments and managing delivery timelines across multiple stakeholders.
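For context on the Dataflow work this listing describes, here is a minimal sketch of a streaming Apache Beam pipeline in Python that reads log-level ad events from Pub/Sub and appends them to BigQuery. The project, subscription, table name, and event fields are hypothetical placeholders, and the destination table is assumed to already exist.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode one Pub/Sub message into a BigQuery-ready row (hypothetical schema)."""
    event = json.loads(message.decode("utf-8"))
    return {
        "ad_unit": event.get("ad_unit"),
        "impressions": int(event.get("impressions", 0)),
        "event_ts": event.get("event_ts"),
    }


def run() -> None:
    options = PipelineOptions(streaming=True)  # runs on DirectRunner locally or on Dataflow
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/ad-logs-sub")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:adtech.ad_log_events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```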

Posted 5 days ago

Apply

1.0 years

1 - 4 Lacs

Hyderabad

On-site

Job Title: Data Analyst – AdTech (1+ Years Experience)
Location: Hyderabad
Experience Level: 2–3 Years
Employment Type: Full-time
Shift Timings: 5 PM - 2 AM IST

About the Role:
We are looking for a highly motivated and detail-oriented Data Analyst with 1+ years of experience to join our AdTech analytics team. In this role, you will be responsible for working with large-scale advertising and digital media datasets, building robust data pipelines, querying and transforming data using GCP tools, and delivering insights through visualization platforms such as Looker Studio, Looker, and Tableau.

Key Responsibilities:
- Analyze AdTech data (e.g., ads.txt, programmatic delivery, campaign performance, revenue metrics) to support business decisions.
- Design, develop, and maintain scalable data pipelines using GCP-native tools (e.g., Cloud Functions, Dataflow, Composer).
- Write and optimize complex SQL queries in BigQuery for data extraction and transformation (see the sketch below).
- Build and maintain dashboards and reports in Looker Studio to visualize KPIs and campaign performance.
- Collaborate with cross-functional teams including engineering, operations, product, and client teams to gather requirements and deliver analytics solutions.
- Monitor data integrity, identify anomalies, and work on data quality improvements.
- Provide actionable insights and recommendations based on data analysis and trends.

Required Qualifications:
- 1+ years of experience in a data analytics or business intelligence role.
- Hands-on experience with AdTech datasets and understanding of digital advertising concepts.
- Strong proficiency in SQL, particularly with Google BigQuery.
- Experience building and managing data pipelines using Google Cloud Platform (GCP) tools.
- Proficiency in Looker Studio.
- Strong problem-solving skills and attention to detail.
- Excellent communication skills with the ability to explain technical topics to non-technical stakeholders.

Preferred Qualifications:
- Experience with additional visualization tools such as Tableau, Power BI, or Looker (BI).
- Exposure to data orchestration tools like Apache Airflow (via Cloud Composer).
- Familiarity with Python for scripting or automation.
- Understanding of cloud data architecture and AdTech integrations (e.g., DV360, Ad Manager, Google Ads).
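As an illustration of the BigQuery querying this role calls for, the following sketch uses the google-cloud-bigquery Python client to compute seven-day campaign KPIs. The table and column names are hypothetical and would be replaced by the real dataset.

```python
from google.cloud import bigquery

client = bigquery.Client()  # picks up application-default credentials

# Hypothetical log-level table; adjust the names to the real dataset.
QUERY = """
SELECT
  campaign_id,
  SUM(impressions) AS impressions,
  SUM(clicks) AS clicks,
  SAFE_DIVIDE(SUM(clicks), SUM(impressions)) AS ctr
FROM `my-project.adtech.campaign_logs`
WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
GROUP BY campaign_id
ORDER BY impressions DESC
"""

for row in client.query(QUERY).result():
    print(row.campaign_id, row.impressions, row.clicks, row.ctr)
```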

Posted 5 days ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


CloudWerx is looking for a dynamic SENIOR ENGINEER, DATA to become a vital part of our vibrant DATA ANALYTICS & ENGINEERING TEAM, working in HYDERABAD, INDIA. Join the energy and come be part of the momentum!

As a Senior Cloud Data Engineer you will be at the forefront of cloud technology, architecting and implementing cutting-edge data solutions that drive business transformation. You'll have the opportunity to work with a diverse portfolio of clients, from innovative startups to industry leaders, solving complex data challenges using the latest GCP technologies. This role offers a unique blend of technical expertise and client interaction, allowing you to not only build sophisticated data systems but also to consult directly with clients, shaping their data strategies and seeing the real-world impact of your work. If you're passionate about pushing the boundaries of what's possible with cloud data engineering and want to be part of a team that's shaping the future of data-driven decision making, this is your chance to make a significant impact in a rapidly evolving field.

Our goal is to have a sophisticated team equipped with expert technical skills in addition to keen business acumen. Each member of our team adds unique value to the business and the customer. CloudWerx is committed to a culture where we attract the best talent in the industry. We aim to be second-to-none when it comes to cloud consulting and business acceleration. This is an incredible opportunity to get involved in an engineering-focused cloud consulting company that provides the most elite technology resources to solve the toughest challenges. This role is a full-time opportunity in our Hyderabad Office.

INSIGHT ON YOUR IMPACT
- Lead technical discussions with clients, translating complex technical concepts into clear, actionable strategies that align with their business goals.
- Architect and implement innovative data solutions that transform our clients' businesses, enabling them to harness the full power of their data assets.
- Collaborate with cross-functional teams to design and optimize data pipelines that process petabytes of data, driving critical business decisions and insights.
- Mentor junior engineers and contribute to the growth of our data engineering practice, fostering a culture of continuous learning and innovation.
- Drive the adoption of cutting-edge GCP technologies, positioning our company and clients at the forefront of the cloud data revolution.
- Identify opportunities for process improvements and automation, increasing the efficiency and scalability of our consulting services.
- Collaborate with sales and pre-sales teams to scope complex data engineering projects, ensuring technical feasibility and alignment with client needs.

YOUR QUALIFICATION, YOUR INFLUENCE
To be successful in the role, you must possess the following skills:
- Proven experience (typically 4-8 years) in data engineering, with a strong focus on Google Cloud Platform technologies.
- Deep expertise in GCP data services, particularly tools like BigQuery, Cloud Composer, Cloud SQL, and Dataflow, with the ability to architect complex data solutions.
- Strong proficiency in Python and SQL, with the ability to write efficient, scalable, and maintainable code.
- Demonstrated experience in data modeling, database performance tuning, and cloud migration projects.
- Excellent communication skills, capable of explaining complex technical concepts to both technical and non-technical stakeholders.
- Proven ability to work directly with clients, understanding their business needs and translating them into technical solutions.
- Strong project management skills, including experience with Agile methodologies and tools like Jira.
- Ability to lead and mentor junior team members, fostering a culture of knowledge sharing and continuous improvement.
- Track record of staying current with emerging technologies and best practices in cloud data engineering.
- Experience working in a consulting or professional services environment, with the ability to manage multiple projects and priorities.
- Demonstrated problem-solving skills, with the ability to think creatively and innovatively to overcome technical challenges.
- Willingness to obtain relevant Google Cloud certifications if not already held.
- Ability to work collaboratively in a remote environment, with excellent time management and self-motivation skills.
- Cultural sensitivity and adaptability, with the ability to work effectively with diverse teams and clients across different time zones.

Our Diversity and Inclusion Commitment
At CloudWerx, we are dedicated to creating a workplace that values and celebrates diversity. We believe that a diverse and inclusive environment fosters innovation, collaboration, and mutual respect. We are committed to providing equal employment opportunities for all individuals, regardless of background, and actively promote diversity across all levels of our organization. We welcome all walks of life, as we are committed to building a team that embraces and mirrors a wide range of perspectives and identities. Join us in our journey toward a more inclusive and equitable workplace.

Background Check Requirement
All candidates for employment will be subject to pre-employment background screening for this position. All offers are contingent upon the successful completion of the background check. For additional information on the background check requirements and process, please reach out to us directly.

Our Story
CloudWerx is an engineering-focused cloud consulting firm born in Silicon Valley, in the heart of hyper-scale and innovative technology. We help businesses looking to architect, migrate, optimize, secure, or cut costs in a cloud environment. Our team has unique experience working in some of the most complex cloud environments at scale and can help businesses accelerate with confidence.

Posted 5 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Senior Data Analyst – AdTech (Team Lead)
Location: Hyderabad
Experience Level: 4–6 Years
Employment Type: Full-time
Shift Timings: 5 PM - 2 AM IST

About the Role:
We are looking for a highly experienced and hands-on Senior Data Analyst (AdTech) to lead our analytics team. This role is ideal for someone with a strong background in log-level data handling, cross-platform data engineering, and a solid command of modern BI tools. You'll play a key role in building scalable data pipelines, leading analytics strategy, and mentoring a team of analysts.

Key Responsibilities:
- Lead and mentor a team of data analysts, ensuring quality delivery and technical upskilling.
- Design, develop, and maintain scalable ETL/ELT pipelines using GCP tools (BigQuery, Dataflow, Cloud Composer, Cloud Functions, Pub/Sub).
- Ingest and process log-level data from platforms like Google Ad Manager, Google Analytics (GA4/UA), DV360, and other advertising and marketing tech sources.
- Build and optimize data pipelines from diverse sources via APIs, cloud connectors, and third-party tools (e.g., Supermetrics, Fivetran, Stitch).
- Integrate and manage data across multiple cloud platforms and data warehouses such as BigQuery, Snowflake, DOMO, and AWS (Redshift, S3).
- Own the creation of data models, data marts, and analytical layers to support dashboards and deep-dive analyses.
- Build and maintain scalable, intuitive dashboards using Looker Studio, Tableau, Power BI, or Looker.
- Partner with engineering, product, revenue ops, and client teams to gather requirements and drive strategic insights from data.
- Ensure data governance, security, and quality standards are followed across the analytics ecosystem.

Required Qualifications:
- 4–6 years of experience in data analytics or data engineering roles, with at least 1–2 years in a leadership capacity.
- Deep expertise working with log-level AdTech data: Google Ad Manager, Google Analytics (GA4), programmatic delivery logs, and campaign-level data.
- Strong knowledge of SQL and Google BigQuery for large-scale data querying and transformation.
- Hands-on experience building data pipelines using GCP tools (Dataflow, Composer, Cloud Functions, Pub/Sub, Cloud Storage).
- Proven experience integrating data from various APIs and third-party connectors.
- Experience working with multiple data warehouses: Snowflake, DOMO, AWS Redshift, etc.
- Strong skills in data visualization tools: Looker Studio, Tableau, Power BI, or Looker.
- Excellent stakeholder communication and documentation skills.

Preferred Qualifications:
- Scripting experience in Python or JavaScript for automation and custom ETL development.
- Familiarity with version control (e.g., Git), CI/CD pipelines, and workflow orchestration.
- Exposure to privacy regulations and consent-based data handling in digital advertising (GDPR, CCPA).
- Experience working in agile environments and managing delivery timelines across multiple stakeholders.

Posted 5 days ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Data Analyst – AdTech (1+ Years Experience)
Location: Hyderabad
Experience Level: 2–3 Years
Employment Type: Full-time
Shift Timings: 5 PM - 2 AM IST

About the Role:
We are looking for a highly motivated and detail-oriented Data Analyst with 1+ years of experience to join our AdTech analytics team. In this role, you will be responsible for working with large-scale advertising and digital media datasets, building robust data pipelines, querying and transforming data using GCP tools, and delivering insights through visualization platforms such as Looker Studio, Looker, and Tableau.

Key Responsibilities:
- Analyze AdTech data (e.g., ads.txt, programmatic delivery, campaign performance, revenue metrics) to support business decisions.
- Design, develop, and maintain scalable data pipelines using GCP-native tools (e.g., Cloud Functions, Dataflow, Composer).
- Write and optimize complex SQL queries in BigQuery for data extraction and transformation.
- Build and maintain dashboards and reports in Looker Studio to visualize KPIs and campaign performance.
- Collaborate with cross-functional teams including engineering, operations, product, and client teams to gather requirements and deliver analytics solutions.
- Monitor data integrity, identify anomalies, and work on data quality improvements.
- Provide actionable insights and recommendations based on data analysis and trends.

Required Qualifications:
- 1+ years of experience in a data analytics or business intelligence role.
- Hands-on experience with AdTech datasets and understanding of digital advertising concepts.
- Strong proficiency in SQL, particularly with Google BigQuery.
- Experience building and managing data pipelines using Google Cloud Platform (GCP) tools.
- Proficiency in Looker Studio.
- Strong problem-solving skills and attention to detail.
- Excellent communication skills with the ability to explain technical topics to non-technical stakeholders.

Preferred Qualifications:
- Experience with additional visualization tools such as Tableau, Power BI, or Looker (BI).
- Exposure to data orchestration tools like Apache Airflow (via Cloud Composer).
- Familiarity with Python for scripting or automation.
- Understanding of cloud data architecture and AdTech integrations (e.g., DV360, Ad Manager, Google Ads).

Posted 5 days ago

Apply

4.0 years

0 Lacs

Ghaziabad, Uttar Pradesh, India

On-site


Responsibilities:
As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:
● Design, develop, and support data pipelines and related data products and platforms.
● Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
● Perform application impact assessments, requirements reviews, and develop work estimates.
● Develop test strategies and site reliability engineering measures for data products and solutions.
● Participate in agile development and solution reviews.
● Mentor junior Data Engineers.
● Lead the resolution of critical operations issues, including post-implementation reviews.
● Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
● Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
● Demonstrate SQL and database proficiency in various data engineering tasks.
● Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (see the sketch below).
● Develop Unix scripts to support various data operations.
● Model data to support business intelligence and analytics initiatives.
● Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
● Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion and Dataproc (good to have).

Qualifications:
● Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or related field.
● 4+ years of data engineering experience.
● 2 years of data solution architecture and design experience.
● GCP Certified Data Engineer (preferred).

Interested candidates can send their resumes to riyanshi@etelligens.in
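Since this role involves setting up DAGs in orchestrators such as Apache Airflow, here is a minimal extract-transform-load DAG skeleton in Python. The DAG ID, task callables, and schedule are hypothetical placeholders; real retries, alerting, and connections would come from the actual requirements.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    ...  # pull source data, e.g. from an API or Cloud Storage


def transform() -> None:
    ...  # clean and model the extracted data


def load() -> None:
    ...  # write the result to BigQuery or another warehouse


with DAG(
    dag_id="daily_ingest",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task  # linear dependency chain
```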

Posted 5 days ago

Apply

0 years

0 Lacs

India

Remote


💼 Senior Backend Developer – Java & GCP
📍 Location: Remote (India)
🕒 Type: Contract (Long-term engagement)

We're looking for a skilled backend engineer with a strong foundation in Java and hands-on experience with Google Cloud technologies to join a high-performing team building cloud-native solutions.

🔧 Key Skills & Requirements
✅ Strong expertise in Java 8 and above
✅ Experience with Python is a big plus
✅ Hands-on experience with Google Cloud Platform (GCP) tools: Dataproc, Dataflow, BigQuery, Pub/Sub
✅ Proficient with containerization technologies: Kubernetes, OpenShift, Docker
✅ Solid understanding of CI/CD pipelines
✅ Familiarity with observability & monitoring tools like ELK or similar

📌 Why Join?
✔ Work on high-impact, scalable cloud systems
✔ Leverage modern DevOps and GCP practices
✔ 100% remote flexibility

Posted 5 days ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Position: Data Architect
Skills: GCP, DA, Development, SQL, Python, BigQuery, Dataproc, Dataflow, Data Pipelines
Experience: 10+ Years

Roles and Responsibilities:
• 10+ years of relevant work experience, including previous experience leading data-related projects in the field of reporting and analytics.
• Design, build and maintain scalable data lakes and data warehouses in the cloud (GCP).
• Expertise in gathering business requirements, analysing business needs, and defining the BI/DW architecture to support and help deliver technical solutions to complex business and technical requirements.
• Create solution prototypes and participate in technology selection; perform POCs and technical presentations.
• Architect, develop and test scalable data warehouse and data pipeline architectures in cloud technologies (GCP).
• Experience in SQL and NoSQL DBMSs such as MS SQL Server, MySQL, PostgreSQL, DynamoDB, Cassandra, and MongoDB.
• Design and develop scalable ETL processes, including error handling.
• Expert in query and programming languages: MS SQL Server, T-SQL, PostgreSQL, MySQL, Python, R.
• Prepare data structures for advanced analytics and self-service reporting using MS SQL, SSIS, SSRS.
• Write scripts for stored procedures, database snapshot backups, and data archiving.
• Experience with any of these cloud-based technologies:
  o Power BI/Tableau, Azure Data Factory, Azure Synapse, Azure Data Lake
  o AWS Redshift, Glue, Athena, AWS QuickSight
  o Google Cloud Platform

Good to have:
• Agile development environment pairing DevOps with CI/CD pipelines
• AI/ML background

Interested candidates can share their CV at dikshith.nalapatla@motivitylabs.com

Posted 5 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


WHAT YOU DO AT AMD CHANGES EVERYTHING

We care deeply about transforming lives with AMD technology to enrich our industry, our communities, and the world. Our mission is to build great products that accelerate next-generation computing experiences: the building blocks for the data center, artificial intelligence, PCs, gaming and embedded. Underpinning our mission is the AMD culture. We push the limits of innovation to solve the world's most important challenges. We strive for execution excellence while being direct, humble, collaborative, and inclusive of diverse perspectives. AMD together we advance_

MTS SOFTWARE DEVELOPMENT ENGINEER

The Role:
Performance modelling and evaluation of ACAP workloads, to eliminate bottlenecks as early as possible and guide the architecture of future-generation devices. This is a challenging role in the FPGA Silicon Architecture Group in the AECG business unit of AMD in Hyderabad.

About the Team:
The AECG group in AMD designs cutting-edge FPGAs and Adaptable SOCs consisting of processor subsystems and associated peripherals, programmable fabric, memory controllers, I/O interfaces and interconnect.

Key Responsibilities:
- Modelling and simulation of workload dataflow networks and clock-accurate SOC components.
- Performance analysis and identification of bottlenecks.
- Quick prototyping, long-term design decisions, and exploring novel architectures.
- Enhancement of the existing tools and knowledge base.
- Collaborating with architects in the development of next-generation devices.
- Collaborating with customer-facing teams to identify scope of optimization for future market scenarios.
- Breaking down system-level designs into simpler dataflow models to identify bottlenecks and capture memory and communication overheads.
- Knowledge sharing with teammates through thorough documentation.

Preferred Experience:
- Experience in SOC architecture or performance analysis.
- Strong background in computer architecture, hardware performance metrics and bottlenecks.
- Experience in modelling and simulation of hardware.
- Experience in performance profiling, creating experiments to address various use cases, and doing design-space exploration.
- Good to have: experience creating designs for ACAP devices or HLS.
- Good communication skills.

Academic Credentials:
Bachelor's or Master's degree in Computer Science, Computer Engineering, Electrical Engineering, or equivalent.

Benefits offered are described at AMD benefits at a glance.

AMD does not accept unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services. AMD and its subsidiaries are equal opportunity, inclusive employers and will consider all applicants without regard to age, ancestry, color, marital status, medical condition, mental or physical disability, national origin, race, religion, political and/or third-party affiliation, sex, pregnancy, sexual orientation, gender identity, military or veteran status, or any other characteristic protected by law. We encourage applications from all qualified candidates and will accommodate applicants' needs under the respective laws throughout all stages of the recruitment and selection process.

Posted 5 days ago

Apply

4.0 years

0 Lacs

India

Remote


GCP Data Engineer
Location: Remote
Type: Full-time
Rate: Market
Client: Telus

Required Skills:
● 4+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets.
● Design, build and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among others.
● Build modular code for reusable pipelines and complex ingestion frameworks that ease loading data into the data lake or data warehouse from multiple sources (see the sketch below).
● Work closely with analysts and business process owners to translate business requirements into technical solutions.
● Coding experience in scripting and languages (Python, SQL, PySpark).
● Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space (BigQuery, Google Composer, Airflow, CloudSQL, PostgreSQL, Oracle, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, Vertex AI).
● Maintain the highest levels of development practice, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code with repeatable quality and predictability.
● Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker.
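The ingestion-framework work described here often reduces to parameterized load jobs. Below is a minimal sketch using the google-cloud-bigquery Python client to load CSV files from Cloud Storage into a warehouse table; the bucket, URI pattern, and table ID are hypothetical, and schema autodetection is used only to keep the sketch short.

```python
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,            # skip the header row
    autodetect=True,                # infer the schema for this sketch
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/orders_*.csv",   # hypothetical source files
    "my-project.datalake.orders_raw",        # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes
print(f"Loaded {load_job.output_rows} rows.")
```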

Posted 5 days ago

Apply

14.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


DevOps Manager
Location: Ahmedabad/Hyderabad
Experience Required: 14+ years total experience, with 4–5 years in managerial roles.

Technical Knowledge and Skills:
Mandatory:
• Cloud: GCP (complete stack from IAM to GKE)
• CI/CD: end-to-end pipeline ownership (GitHub Actions, Jenkins, Argo CD)
• IaC: Terraform, Helm
• Containers: Docker, Kubernetes
• DevSecOps: Vault, Trivy, OWASP

Nice to Have:
• FinOps exposure for cost optimization
• Big Data tools familiarity (BigQuery, Dataflow)
• Familiarity with Kong, Anthos, Istio

Scope:
• Lead DevOps team across multiple pods and products
• Define roadmap for automation, security, and CI/CD
• Ensure operational stability of deployment pipelines

Roles and Responsibilities:
• Architect and guide implementation of enterprise-grade CI/CD pipelines that support multi-environment deployments, microservices architecture, and zero-downtime delivery practices.
• Oversee Infrastructure-as-Code initiatives to establish consistent and compliant cloud provisioning using Terraform, Helm, and policy-as-code integrations.
• Champion DevSecOps practices by embedding security controls throughout the pipeline, ensuring image scanning, secrets encryption, policy checks, and runtime security enforcement.
• Lead and manage a geographically distributed DevOps team, setting performance expectations, development plans, and engagement strategies.
• Drive cross-functional collaboration with engineering, QA, product, and SRE teams to establish integrated DevOps governance practices.
• Develop a framework for release readiness, rollback automation, change control, and environment reconciliation processes.
• Monitor deployment health, release velocity, lead time to recovery, and infrastructure cost optimization through actionable DevOps metrics dashboards.
• Serve as the primary point of contact for C-level stakeholders during major infrastructure changes, incident escalations, or audits.
• Own the budgeting and cost management strategy for DevOps tooling, cloud consumption, and external consulting partnerships.
• Identify, evaluate, and onboard emerging DevOps technologies, ensuring team readiness through structured onboarding, POCs, and knowledge sessions.
• Foster a culture of continuous learning, innovation, and ownership, driving internal tech talks, hackathons, and community engagement.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site


Job Description
This is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

Summary
Database Engineer/Developer - Core Skills: Proficiency in SQL and relational database management systems like PostgreSQL or MySQL, along with database design principles. Strong familiarity with Python for scripting and data manipulation tasks, with additional knowledge of Python OOP being advantageous. A good understanding of data security measures and compliance is also required. Demonstrated problem-solving skills with a focus on optimizing database performance and automating data import processes, and knowledge of cloud-based databases like AWS RDS and Google BigQuery. Minimum 5 years of experience.

Database Engineer - Data Research Engineering

Position Overview
At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.

The Data Research Engineering Team is a brand-new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Its responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. The team plays a crucial role in enabling data-driven decision-making and meeting the organization's data needs.

A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role.

Responsibilities
- Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
- Work with databases of varying scales, including small-scale databases and databases involving big data processing.
- Work on data security and compliance by implementing access controls, encryption, and compliance standards.
- Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
- Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery.
- Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency (see the sketch below).
- Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
- Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
- Monitor database health and identify and resolve issues.
- Collaborate with the full-stack web developer on the team to support the implementation of efficient data access and retrieval mechanisms.
- Implement data security measures to protect sensitive information and comply with relevant regulations.
- Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.
- Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis.
- Use Python for tasks such as data manipulation, automation, and scripting.
- Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines.
- Assume accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Perform tasks with precision and build reliable systems.
- Leverage online resources effectively (e.g., StackOverflow, ChatGPT, Bard), while considering their capabilities and limitations.

Skills and Experience
- Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
- Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
- Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
- Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
- Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
- Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
- Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
- Knowledge of SQL and understanding of database design principles, normalization, and indexing.
- Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources.
- Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
- Eagerness to develop import workflows and scripts to automate data import processes.
- Knowledge of data security best practices, including access controls, encryption, and compliance standards.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Strong willingness to learn and expand knowledge in data engineering.
- Familiarity with Agile development methodologies is a plus.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Comfortable with autonomy and ability to work independently.
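For the import-workflow automation this posting mentions, a typical starting point is a short Pandas plus SQLAlchemy script that moves spreadsheet data into PostgreSQL. The file name, connection string, and column names below are hypothetical placeholders.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical DSN; real credentials belong in a secret manager, not in code.
engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/analytics")

df = pd.read_csv("exports/products.csv")

# Normalize headers and drop duplicate rows before loading.
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
df = df.drop_duplicates(subset=["product_id"])

# Append in chunks to keep memory use bounded on large files.
df.to_sql("products", engine, if_exists="append", index=False, chunksize=1000)
```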

Posted 5 days ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

Remote


Role: Database Engineer
Location: Remote

Skills and Experience
● Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
● Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
● Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
● Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
● Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
● Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
● Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
● Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
● Knowledge of SQL and understanding of database design principles, normalization, and indexing.
● Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources.
● Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
● Eagerness to develop import workflows and scripts to automate data import processes.
● Knowledge of data security best practices, including access controls, encryption, and compliance standards.
● Strong problem-solving and analytical skills with attention to detail.
● Creative and critical thinking.
● Strong willingness to learn and expand knowledge in data engineering.
● Familiarity with Agile development methodologies is a plus.
● Experience with version control systems, such as Git, for collaborative development.
● Ability to thrive in a fast-paced environment with rapidly changing priorities.
● Ability to work collaboratively in a team environment.
● Good and effective communication skills.
● Comfortable with autonomy and ability to work independently.

Posted 5 days ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

Remote


Role: Database Engineer
Location: Remote
Notice Period: 30 Days

Skills and Experience
- Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
- Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
- Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
- Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
- Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
- Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
- Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
- Knowledge of SQL and understanding of database design principles, normalization, and indexing.
- Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources.
- Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
- Eagerness to develop import workflows and scripts to automate data import processes.
- Knowledge of data security best practices, including access controls, encryption, and compliance standards.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Strong willingness to learn and expand knowledge in data engineering.
- Familiarity with Agile development methodologies is a plus.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Comfortable with autonomy and ability to work independently.

Posted 5 days ago

Apply

4.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


We are seeking a highly experienced Senior Data Engineer with strong expertise in Google Cloud Platform (GCP) and exposure to machine learning engineering (MLE) to support high-impact banking initiatives. The ideal candidate will combine hands-on engineering skills, architectural insight, and a proven track record of building secure, scalable, and intelligent data solutions in financial services.

Location: Pune, India (Work from Office - Completely onsite)
Experience Required: Minimum 4-5 years
Position Type: Full-time
Start Date: Immediate or as per notice period

Key Responsibilities:
- Provide technical leadership on GCP data projects, collaborating with business, data science, and ML teams.
- Design and implement scalable data pipelines using GCP tools (BigQuery, Dataflow, Composer, etc.).
- Support MLOps workflows, including feature engineering and real-time inference.
- Ensure secure, compliant, and high-quality data infrastructure aligned with banking standards.
- Optimize BigQuery performance and cost efficiency for large-scale datasets (see the sketch below).
- Enable BI insights using tools like Power BI and Looker.
- Own the end-to-end data lifecycle across development, deployment, and monitoring.

Required Skills:
- 6–10 years of experience in data engineering; 3+ on GCP.
- Deep proficiency in GCP services (BigQuery, Dataflow, Composer, Dataproc).
- Strong Python and SQL skills; familiarity with Terraform and CI/CD tools.
- Experience supporting ML pipelines and maintaining compliance in regulated environments.

Preferred:
- GCP certifications (Professional Data Engineer / Architect).
- Familiarity with MLOps (Vertex AI, Kubeflow), financial data domains, and streaming data.
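One concrete form of the BigQuery cost and performance optimization named above is partitioning and clustering large tables so queries that filter on date and account scan less data. The sketch below issues DDL through the Python client; the table and column names are hypothetical and purely illustrative.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Partitioning by day and clustering by account keeps scans (and cost) bounded
# when queries filter on transaction date and account. Names are hypothetical.
DDL = """
CREATE TABLE IF NOT EXISTS `my-project.banking.transactions`
(
  txn_id STRING,
  account_id STRING,
  amount NUMERIC,
  txn_ts TIMESTAMP
)
PARTITION BY DATE(txn_ts)
CLUSTER BY account_id
"""

client.query(DDL).result()  # wait for the DDL job to complete
```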

Posted 5 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Roles and Responsibilities
- Proficiency in building highly scalable ETL and streaming-based data pipelines using Google Cloud Platform (GCP) services and products like BigQuery and Cloud Dataflow.
- Proficiency in large-scale data platforms and data processing systems such as Google BigQuery, Amazon Redshift, and Azure Data Lake.
- Excellent Python, PySpark and SQL development and debugging skills; exposure to other Big Data frameworks like Hadoop Hive would be an added advantage.
- Experience building systems to retrieve and aggregate data from event-driven messaging frameworks (e.g., RabbitMQ and Pub/Sub).

Secondary Skills: Cloud Bigtable, AI/ML solutions, Compute Engine, Cloud Data Fusion.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Position Overview
We are looking for an experienced Lead Data Engineer to join our dynamic team. If you are passionate about building scalable software solutions and working collaboratively with cross-functional teams to define requirements and deliver solutions, we would love to hear from you. ShyftLabs is a growing data product company that was founded in early 2020 and works primarily with Fortune 500 companies. We deliver digital solutions built to help accelerate the growth of businesses in various industries, by focusing on creating value through innovation.

Job Responsibilities:
- Develop and maintain data pipelines and ETL/ELT processes using Python.
- Design and implement scalable, high-performance applications.
- Work collaboratively with cross-functional teams to define requirements and deliver solutions.
- Develop and manage near real-time data streaming solutions using Pub/Sub or Beam (see the sketch below).
- Contribute to code reviews, architecture discussions, and continuous improvement initiatives.
- Monitor and troubleshoot production systems to ensure reliability and performance.

Basic Qualifications:
- 5+ years of professional software development experience with Python.
- Strong understanding of software engineering best practices (testing, version control, CI/CD).
- Experience building and optimizing ETL/ELT processes and data pipelines.
- Proficiency with SQL and database concepts.
- Experience with data processing frameworks (e.g., Pandas).
- Understanding of software design patterns and architectural principles.
- Ability to write clean, well-documented, and maintainable code.
- Experience with unit testing and test automation.
- Experience working with any cloud provider (GCP is preferred).
- Experience with CI/CD pipelines and infrastructure as code.
- Experience with containerization technologies like Docker or Kubernetes.
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- Proven track record of delivering complex software projects.
- Excellent problem-solving and analytical thinking skills.
- Strong communication skills and ability to work in a collaborative environment.

Preferred Qualifications:
- Experience with GCP services, particularly Cloud Run and Dataflow.
- Experience with stream processing technologies (Pub/Sub).
- Familiarity with big data technologies (Airflow).
- Experience with data visualization tools and libraries.
- Knowledge of CI/CD pipelines with GitLab and infrastructure as code with Terraform.
- Familiarity with platforms like Snowflake, BigQuery or Databricks; GCP Data Engineer certification.

We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
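As background for the near real-time streaming duties in this listing, here is a minimal Pub/Sub subscriber sketch in Python. The project and subscription IDs are hypothetical, and the 30-second listen window exists only so the sketch terminates.

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "events-sub")  # hypothetical


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print("Received:", message.data)  # real code would parse and route the payload
    message.ack()


streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # listen briefly for this sketch
except TimeoutError:
    streaming_pull.cancel()
    streaming_pull.result()  # wait for the shutdown to complete
```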

Posted 5 days ago

Apply


5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Job Description: Consultant Delivery (Data Engineer)

About Worldline
At Worldline, we are pioneers in payments technology, committed to creating innovative solutions that make financial transactions secure, accessible, and seamless worldwide. Our diverse team of professionals collaborates across cultures and disciplines, driving progress that benefits society and businesses of all sizes. We believe that diverse perspectives fuel innovation and are dedicated to fostering an inclusive environment where all individuals can thrive.

The Opportunity
We are seeking a highly skilled and knowledgeable Data Engineer to join our Data Management team on a transformative Move to Cloud (M2C) project. This role offers a unique opportunity to contribute to a critical initiative, migrating our data infrastructure to the cloud and optimizing our data pipelines for performance and scalability. We welcome applicants from all backgrounds and experiences, believing that our strength lies in our diversity.

Technical Skills & Qualifications
Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Experience: Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based solutions, preferably within the Google Cloud Platform (GCP) ecosystem.

Essential Skills:
- Strong knowledge of version control systems and CI/CD pipelines.
- Proficiency in GCP services, particularly DataProc, Dataflow, Cloud Functions, Workflows, Cloud Composer, and BigQuery.
- Extensive experience with ETL tools, specifically dbt Labs, and a deep understanding of ETL best practices.
- Proven ability to build and optimize data pipelines, architectures, and datasets from both structured and unstructured data sources.
- Proficiency in SQL and Python, with experience using Spark.
- Excellent analytical and problem-solving skills, with the ability to translate complex requirements into technical solutions.

Desirable Skills:
- Relevant certifications in Google Cloud Platform or other data engineering credentials.

Preferred Skills:
- Experience migrating data from on-premises data warehouses (e.g., Oracle) to cloud-based solutions.
- Experience working with large-scale datasets and complex data transformations.
- Strong communication and interpersonal skills, with the ability to collaborate effectively within a team environment.

Why Join Us?
At Worldline, we believe that embracing diversity and promoting inclusion drives innovation and success. We foster a workplace where everyone feels valued and empowered to bring their authentic selves. We offer extensive training, mentorship, and development programs to support your growth and help you make a meaningful impact. Join a global team of passionate professionals shaping the future of payments technology, where your ideas, experiences, and perspectives are appreciated and celebrated. Learn more about life at Worldline at jobs.worldline.com.

We are an Equal Opportunity Employer. We do not discriminate based on race, ethnicity, religion, color, national origin, sex (including pregnancy and childbirth), sexual orientation, gender identity or expression, age, disability, or any other legally protected characteristic. We are committed to creating a diverse and inclusive environment for all employees.

Posted 5 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Minimum qualifications:

Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience. 8 years of experience with software development in one or more programming languages (e.g., Python, C, C++, Java, JavaScript). 3 years of experience in a technical leadership role overseeing projects, with 2 years of experience in a people management or supervision/team leadership role. Experience in one or more disciplines such as machine learning, recommendation systems, natural language processing, computer vision, pattern recognition, or artificial intelligence.

Preferred qualifications:

Understanding of agentic AI/ML and Large Language Models (LLMs). Excellent coding skills.

About The Job

Like Google's own ambitions, the work of a Software Engineer goes beyond just Search. Software Engineering Managers have not only the technical expertise to take on and provide technical leadership for major projects, but also manage a team of Engineers. You not only optimize your own code but make sure Engineers are able to optimize theirs. As a Software Engineering Manager you manage your project goals, contribute to product strategy, and help develop your team. Teams work all across the company, in areas such as information retrieval, artificial intelligence, natural language processing, distributed computing, large-scale system design, networking, security, data compression, and user interface design; the list goes on and is growing every day. Operating with scale and speed, our exceptional software engineers are just getting started, and as a manager, you guide the way. With technical and leadership expertise, you manage engineers across multiple teams and locations, a large product budget, and the deployment of large-scale projects across multiple sites internationally.

At Corp Eng, we build world-leading business solutions that scale a more helpful Google for everyone. As Google's IT organization, we provide end-to-end solutions for organizations across Google. We deliver the right tools, platforms, and experiences for all Googlers as they create more helpful products and services for everyone. In the simplest terms, we are Google for Googlers.

Responsibilities

Manage a team of AI software engineers, fostering a collaborative and high-performing environment. This includes hiring, mentoring, performance management, and career development. Drive the design, development, and deployment of scalable and reliable Artificial Intelligence/Machine Learning (AI/ML) systems and infrastructure relevant to HR applications (e.g., talent acquisition, performance management, employee engagement, workforce planning). Collaborate with Product Managers and HR stakeholders to understand business needs, define product requirements, and translate them into technical specifications and project plans. Oversee the architecture and implementation of data pipelines using Google's data processing infrastructure (e.g., Beam, Dataflow) to support AI/ML initiatives. Stay up to date on the latest advancements in AI/ML and related technologies, evaluating their potential application within human resources and guiding the team's adoption of relevant innovations.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
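For context on the kind of pipeline work this role oversees, here is a minimal, illustrative sketch of an Apache Beam batch pipeline using the Beam Python SDK with the Dataflow runner. The project, bucket, dataset, and table names are placeholders, not details from the posting.

```python
# Minimal Apache Beam batch pipeline: read CSV lines from Cloud Storage,
# parse them, and write rows to BigQuery. Runs on Dataflow when the
# DataflowRunner is selected. All resource names below are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    """Turn 'employee_id,score' CSV lines into BigQuery rows."""
    employee_id, score = line.split(",")
    return {"employee_id": employee_id, "score": float(score)}


options = PipelineOptions(
    runner="DataflowRunner",             # use "DirectRunner" to test locally
    project="my-project",                # placeholder project ID
    region="us-central1",
    temp_location="gs://my-bucket/tmp",  # placeholder bucket
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:hr_analytics.scores",
            schema="employee_id:STRING,score:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```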

Posted 5 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you with the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do

Design, develop, and operate high-scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality. Independently manage project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax solutions. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activities.

What Experience You Need

Bachelor's degree or equivalent experience. 5+ years of software engineering experience. 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS. 5+ years of experience with Cloud technology: GCP, AWS, or Azure. 5+ years of experience designing and developing cloud-native solutions. 5+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes. 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs.

What could set you apart

Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others (see the sketch below). UI development (e.g. HTML, JavaScript, Angular and Bootstrap). Experience with backend technologies such as Java/J2EE, SpringBoot, SOA and microservices. Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP). Relational databases (e.g. SQL Server, MySQL). Atlassian tooling (e.g. JIRA, Confluence, and GitHub). Developing with modern JDK (v1.7+). Automated testing: JUnit, Selenium, LoadRunner, SoapUI.
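As a loose illustration of the big data differentiator above, here is a minimal sketch of a streaming Beam pipeline that reads from Pub/Sub, windows events, and counts them per window. The subscription path is a placeholder; the posting itself centers on Java/SpringBoot, and Python is used here only to keep all examples on this page in one language.

```python
# Minimal streaming Beam sketch: count Pub/Sub events per 60-second window.
# The subscription path is a placeholder; run with the DataflowRunner (or
# DirectRunner locally) with streaming mode enabled via StandardOptions.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/my-sub"  # placeholder
        )
        | "Window" >> beam.WindowInto(beam.window.FixedWindows(60))  # 60s windows
        | "PairWithOne" >> beam.Map(lambda _msg: ("events", 1))
        | "CountPerWindow" >> beam.CombinePerKey(sum)
        | "Log" >> beam.Map(print)  # swap for a BigQuery/Bigtable sink in practice
    )
```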

Posted 5 days ago

Apply

2.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


What You'll Do

Design, develop, and operate high-scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality. Research, create, and develop software applications to extend and improve on Equifax solutions. Independently manage project priorities, deadlines, and deliverables. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activities.

What Experience You Need

Bachelor's degree or equivalent experience. 2+ years of software engineering experience. 2+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS. 2+ years of experience with Cloud technology: GCP, AWS, or Azure. 2+ years of experience designing and developing cloud-native solutions. 2+ years of experience designing and developing microservices using Java, Spring Framework, GCP SDKs, GKE/Kubernetes. 2+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs. Big Data Technologies: Spark/Scala/Hadoop (see the sketch below).

What could set you apart

Experience designing and developing big data processing solutions using Dataproc, Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others. Cloud certification, especially in GCP. Self-starter that identifies/responds to priority shifts with minimal supervision. Excellent leadership and motivational skills. An inquisitive and innovative mindset with a demonstrated ability to recognize opportunities to create distinctive value. The ability to evaluate workloads to drive efficiency.
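To ground the Spark/Dataproc requirement above, here is a minimal PySpark sketch of the kind of batch job typically submitted to Dataproc: a simple rollup of a CSV by key. All paths and column names are placeholders, not details from the posting.

```python
# Minimal PySpark batch sketch: aggregate a CSV of transactions by account.
# Input/output paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn-rollup").getOrCreate()

txns = (
    spark.read
    .option("header", True)
    .csv("gs://my-bucket/input/transactions.csv")  # placeholder path
)

rollup = (
    txns.groupBy("account_id")
    .agg(F.sum(F.col("amount").cast("double")).alias("total_amount"))
)

# Write the rollup as Parquet for downstream consumers.
rollup.write.mode("overwrite").parquet("gs://my-bucket/output/rollup")  # placeholder
spark.stop()
```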

Posted 5 days ago

Apply

7.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


At Cotality, we are driven by a single mission: to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society. Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry.

Job Description

In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's product development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights.

Contribute to the release of code and applications. Support IT functions hosting services on Azure & GCP, continuous integration, and test automation. Help our team build and own the implementation and the IaC supporting CI/CD. Use modern infrastructure tools and platforms to automate our systems (see the sketch below). Help define the DevOps roadmap, discover what is needed, define the scope and technologies, and help build the backlog.

Job Qualifications

7-9 years of professional work experience. Proficient with cloud-based implementations, including Azure & GCP. Strong knowledge of Ansible, Terraform & IaC scripting. Continuous integration (Semaphore, Jenkins) and test automation. Databases: PostgreSQL, Snowflake, MySQL. Snowflake, Google Dataflow, and Power BI. DevOps knowledge. Experience with NGINX and Linux. Experience with Windows Server and MS SQL Server 2019 or newer. Strong programming and scripting fundamentals (Bash, PowerShell, VBScript). Containers such as Docker. Excellent communication skills. Strong attention to detail and excellent analytical capabilities. A desire to write tools and applications to automate work rather than do everything by hand. Passionate about Continuous Build, Integration, Test, and Delivery systems.

Familiarity With The Following Is a Strong Plus

Tools: JIRA, Confluence, GitHub, TFS. Experience or exposure to Agile/Scrum is ideal.

Cotality's Diversity Commitment

Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone's unique contributions, experiences and values. We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences.

Equal Opportunity Employer Statement

Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace. Please apply on our website for consideration.

Privacy Policy

Global Applicant Privacy Policy. By providing your telephone number, you agree to receive automated (SMS) text messages at that number from Cotality regarding all matters related to your application and, if you are hired, your employment and company business. Message & data rates may apply. You can opt out at any time by responding STOP or UNSUBSCRIBE and will automatically be opted out company-wide.
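In the spirit of the "write tools to automate work" qualification above, here is a small, illustrative Python sketch (not from the posting) that wraps Terraform's real `plan -detailed-exitcode` CLI to report drift across several environments. The environment names and var-file paths are placeholders.

```python
# Illustrative automation sketch: run `terraform plan -detailed-exitcode`
# for each environment var-file and report which environments have drift.
# Per Terraform's CLI docs, exit code 0 = no changes, 2 = changes pending.
# All paths and environment names are placeholders.
import subprocess

ENVIRONMENTS = ["dev", "staging", "prod"]


def plan(env: str) -> int:
    """Run a plan against one environment's var-file and return the exit code."""
    result = subprocess.run(
        ["terraform", "plan", "-detailed-exitcode", f"-var-file={env}.tfvars"],
        capture_output=True,
        text=True,
    )
    return result.returncode


for env in ENVIRONMENTS:
    code = plan(env)
    status = {0: "clean", 2: "drift detected"}.get(code, "plan failed")
    print(f"{env}: {status}")
```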

Posted 5 days ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Position Summary

Strategy & Analytics - AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to: implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms; leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing as-a-service offerings for continuous insights and improvements.

Google Cloud Platform - Data Engineer

Cloud is shifting business models at our clients and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate it. Our clients' journeys span from cloud strategy to implementation, from migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte's Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed.

You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Services (GCP), programming, and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries, or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies, and develop skills that will improve your brand and career as a well-rounded, cutting-edge technologist.

Work you'll do

As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. A sketch of this ingest-and-analyze workflow follows this listing.

The key responsibilities may involve some or all of the areas listed below: act as a trusted technical advisor to customers and solve complex Big Data challenges; create and deliver best-practices recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders; and identify new tools and processes to improve the cloud platform and automate processes.

Qualifications

Technical Requirements

BA/BS degree in Computer Science, Mathematics or a related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable. Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics. Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer. Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript.

Consulting Requirements

6-9 years of relevant consulting, industry or technology experience. Strong problem solving and troubleshooting skills. Strong communicator. Willingness to travel if required by the project.

Preferred Qualifications

Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or building internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have). Experience working with big data, information retrieval, data mining or machine learning, as well as experience in building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow). Working knowledge of ITIL and/or agile methodologies.

Recruiting tips

From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits

At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose

Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development

From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300079
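To make the "ingest, store, process, analyze" workflow concrete, here is a minimal sketch using the google-cloud-bigquery Python client to load a CSV from Cloud Storage and run an aggregate query. The project, dataset, table, and bucket names are placeholders, not details from the posting.

```python
# Minimal BigQuery sketch: load a CSV from GCS into a table, then query it.
# All project/dataset/table/bucket names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

# Ingest: load CSV files from Cloud Storage, autodetecting the schema.
load_job = client.load_table_from_uri(
    "gs://my-bucket/events/*.csv",
    "my-project.analytics.events",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    ),
)
load_job.result()  # wait for the load to finish

# Analyze: daily event counts over the last week.
query = """
    SELECT DATE(event_ts) AS day, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY day
    ORDER BY day
"""
for row in client.query(query).result():
    print(row.day, row.events)
```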

Posted 6 days ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Position Summary

Strategy & Analytics - AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to: implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms; leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing as-a-service offerings for continuous insights and improvements.

Google Cloud Platform - Data Engineer

Cloud is shifting business models at our clients and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate it. Our clients' journeys span from cloud strategy to implementation, from migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte's Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed.

You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Services (GCP), programming, and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries, or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies, and develop skills that will improve your brand and career as a well-rounded, cutting-edge technologist.

Work you'll do

As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. A Pub/Sub ingestion sketch follows this listing.

The key responsibilities may involve some or all of the areas listed below: act as a trusted technical advisor to customers and solve complex Big Data challenges; create and deliver best-practices recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders; and identify new tools and processes to improve the cloud platform and automate processes.

Qualifications

Technical Requirements

BA/BS degree in Computer Science, Mathematics or a related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable. Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics. Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer. Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript.

Consulting Requirements

3-6 years of relevant consulting, industry or technology experience. Strong problem solving and troubleshooting skills. Strong communicator. Willingness to travel if required by the project.

Preferred Qualifications

Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or building internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have). Experience working with big data, information retrieval, data mining or machine learning, as well as experience in building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow). Working knowledge of ITIL and/or agile methodologies.

Recruiting tips

From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits

At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose

Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development

From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300075
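Since this listing also calls out Pub/Sub, here is a minimal sketch of the publish-and-pull round trip using the google-cloud-pubsub Python client; it complements the BigQuery example under the previous listing. The project, topic, and subscription names are placeholders.

```python
# Minimal Pub/Sub sketch: publish a message, then pull and acknowledge it.
# Topic and subscription names are placeholders.
from google.cloud import pubsub_v1

project = "my-project"  # placeholder

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project, "events")  # placeholder topic
future = publisher.publish(topic_path, b"hello", origin="ingest-job")
print("published message id:", future.result())

with pubsub_v1.SubscriberClient() as subscriber:
    sub_path = subscriber.subscription_path(project, "events-sub")  # placeholder
    response = subscriber.pull(request={"subscription": sub_path, "max_messages": 10})
    for msg in response.received_messages:
        print("received:", msg.message.data)
    if response.received_messages:
        subscriber.acknowledge(
            request={
                "subscription": sub_path,
                "ack_ids": [m.ack_id for m in response.received_messages],
            }
        )
```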

Posted 6 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies