JobPe aggregates results for easy access, but you apply directly on the original job portal.

3.0 years

6 - 27 Lacs

Hyderabad, Telangana, India

On-site

About The Opportunity A fast-growing player in the Data & Analytics consulting sector, we build cloud-native data platforms and real-time reporting solutions for enterprises in retail, BFSI, and healthcare. Leveraging Google Cloud’s advanced analytics stack, we turn high-volume data into actionable insights that accelerate digital transformation and revenue growth. Role & Responsibilities Design, develop, and optimize BigQuery data warehouses for petabyte-scale analytics. Build ingestion pipelines using Dataflow, Pub/Sub, and Cloud Storage to ensure reliable, low-latency data availability. Implement ELT/ETL workflows in Python and SQL, applying best practices for partitioning, clustering, and cost control. Create and orchestrate DAGs in Cloud Composer/Airflow to automate data processing and quality checks. Collaborate with analysts and business stakeholders to model datasets, define SLAs, and deliver high-impact dashboards. Harden production environments with CI/CD, Terraform, monitoring, and automated testing for zero-defect releases. Skills & Qualifications Must-Have 3+ years building data pipelines on Google Cloud Platform. Expert hands-on experience with BigQuery optimisation and SQL performance tuning. Proficiency in Python scripting for data engineering tasks. Deep understanding of Dataflow or Apache Beam streaming and batch paradigms. Solid grasp of data warehousing principles, partitioning, and metadata management. Version control, containerisation, and CI/CD exposure (Git, Docker, Cloud Build). Preferred Terraform/IaC for infrastructure provisioning. Experience migrating on-prem warehouses (Teradata, Netezza) to BigQuery. Knowledge of data governance, DLP, and security best practices in GCP. Benefits & Culture Highlights Industry-leading GCP certifications paid and supported. Product-grade engineering culture with peer mentorship and hackathons. On-site, collaboration-rich workplace designed for learning and innovation. Skills: apache beam,ci/cd,sql,python,git,cloud storage,dataflow,docker,bigquery,airflow,cloud build,terraform,data warehousing,gcp data engineer (bigquery)
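For illustration only (not part of the posting): the ingestion pattern described above, where Dataflow consumes from Pub/Sub and lands rows in BigQuery, can be sketched as a minimal Apache Beam pipeline in Python. The project, subscription, table, and schema names below are placeholders; a production pipeline would also need dead-lettering, windowing, and monitoring.

```python
# Minimal Apache Beam streaming sketch: Pub/Sub -> BigQuery, runnable on Dataflow.
# Placeholder project/subscription/table names; launch with e.g.
#   python pipeline.py --runner DataflowRunner --project my-project \
#       --region us-central1 --streaming --temp_location gs://my-bucket/tmp
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a JSON Pub/Sub message into a flat dict suitable for BigQuery."""
    event = json.loads(message.decode("utf-8"))
    return {
        "event_id": event.get("event_id"),
        "event_ts": event.get("event_ts"),
        "payload": json.dumps(event.get("payload", {})),
    }


def run():
    options = PipelineOptions(streaming=True, save_main_session=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "Parse" >> beam.Map(parse_event)
            | "WriteBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```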

Posted 1 month ago

Apply

3.0 years

6 - 27 Lacs

Pune, Maharashtra, India

On-site

About The Opportunity A fast-growing player in the Data & Analytics consulting sector, we build cloud-native data platforms and real-time reporting solutions for enterprises in retail, BFSI, and healthcare. Leveraging Google Cloud’s advanced analytics stack, we turn high-volume data into actionable insights that accelerate digital transformation and revenue growth. Role & Responsibilities Design, develop, and optimize BigQuery data warehouses for petabyte-scale analytics. Build ingestion pipelines using Dataflow, Pub/Sub, and Cloud Storage to ensure reliable, low-latency data availability. Implement ELT/ETL workflows in Python and SQL, applying best practices for partitioning, clustering, and cost control. Create and orchestrate DAGs in Cloud Composer/Airflow to automate data processing and quality checks. Collaborate with analysts and business stakeholders to model datasets, define SLAs, and deliver high-impact dashboards. Harden production environments with CI/CD, Terraform, monitoring, and automated testing for zero-defect releases. Skills & Qualifications Must-Have 3+ years building data pipelines on Google Cloud Platform. Expert hands-on experience with BigQuery optimisation and SQL performance tuning. Proficiency in Python scripting for data engineering tasks. Deep understanding of Dataflow or Apache Beam streaming and batch paradigms. Solid grasp of data warehousing principles, partitioning, and metadata management. Version control, containerisation, and CI/CD exposure (Git, Docker, Cloud Build). Preferred Terraform/IaC for infrastructure provisioning. Experience migrating on-prem warehouses (Teradata, Netezza) to BigQuery. Knowledge of data governance, DLP, and security best practices in GCP. Benefits & Culture Highlights Industry-leading GCP certifications paid and supported. Product-grade engineering culture with peer mentorship and hackathons. On-site, collaboration-rich workplace designed for learning and innovation. Skills: apache beam,ci/cd,sql,python,git,cloud storage,dataflow,docker,bigquery,airflow,cloud build,terraform,data warehousing,gcp data engineer (bigquery)
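This listing also calls for orchestrating DAGs in Cloud Composer/Airflow to automate processing and quality checks. Below is a minimal, hedged sketch of that pattern; the DAG id, dataset, table, and SQL are invented placeholders rather than anything from the role.

```python
# Illustrative Airflow DAG for Cloud Composer: run a BigQuery load query, then a
# simple row-count quality check. All names and the SQL are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryCheckOperator,
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 3 * * *",   # run daily at 03:00
    catchup=False,
) as dag:
    load_sales = BigQueryInsertJobOperator(
        task_id="load_sales",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `my-project.analytics.sales_daily`
                    SELECT order_id, order_ts, amount
                    FROM `my-project.staging.sales_raw`
                    WHERE DATE(order_ts) = '{{ ds }}'
                """,
                "useLegacySql": False,
            }
        },
    )

    check_not_empty = BigQueryCheckOperator(
        task_id="check_not_empty",
        sql="""
            SELECT COUNT(*) > 0
            FROM `my-project.analytics.sales_daily`
            WHERE DATE(order_ts) = '{{ ds }}'
        """,
        use_legacy_sql=False,
    )

    load_sales >> check_not_empty
```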

Posted 1 month ago

Apply

1.0 years

4 - 7 Lacs

Delhi

On-site

Job Title: Nursing/Health Care Assistant Location: Oman Employment Type: Full-Time (rotational shifts, weekend availability) Salary: 250 to 300 OMR per month Reports To: RNs / LPNs / Nurse Manager Job Summary We are seeking a compassionate and dedicated Nursing/Health Care Assistant to support our nursing and rehabilitation team in delivering exceptional patient care. Under the supervision of RNs/LPNs, you will assist with daily living activities, monitor vital signs, maintain hygiene and safety, support therapy sessions, manage feeding and incontinence, perform light housekeeping, and assist with admissions, transfers, and transportation. Key Responsibilities 1. Personal Care & Activities of Daily Living Assist patients with bathing, grooming, dressing, toileting, and incontinence care. Support mobility: transfers, ambulation, positioning, turning to prevent bedsores, and range-of-motion exercises. Provide tube feeding and feeding assistance when necessary. 2. Observation & Monitoring Measure and record vital signs (BP, pulse, temperature, respiration) and intake/output per shift. Observe and document changes in behaviour, mood, physical condition, or signs of distress/aggression, and report promptly. Assist in restraining patients as per rehabilitation protocols. 3. Therapeutic Support Aid physiotherapists and participate in group or individual therapy sessions. Escort patients in emergency and non-emergency situations within the facility or to outpatient (OPD) appointments and events. 4. Medical & Equipment Care Support light medical tasks under supervision (e.g., non‑sterile dressings, routine equipment/supply care). Perform inventory checks and ensure medical supplies/equipment are organized and functional. 5. Environment & Safety Ensure patient rooms are clean and hygienic: change linens, sanitize equipment, tidy rooms. Maintain infection control, follow health & safety protocols, and supervise patients to prevent falls or harm. 6. Admissions, Transfers & Documentation Assist with patient admissions, transfers, and discharges. Accurately record care activities, observations, vitals, feeding, and output in patient charts. 7. Emotional & Companionship Support Provide compassionate companionship, basic patient education, and emotional support. Qualifications & Skills ANM diploma (2‑year) or CNA/Healthcare Assistant certification. 1–3 years minimum healthcare or GNM/BSc or relevant qualification; 3+ years preferred. CPR/BLS certification advantageous. Valid Dataflow clearance (for international candidates). Strong interpersonal, communication, empathy, and confidentiality skills. Physically able to lift up to ~50 lbs, stand for long periods, and perform patient transfers. Working Hours & Benefits Schedule : Rotational shifts; weekend availability. Benefits : Free Joining Ticket (Will be reimbursed after the 3 months’ Probation period) 30 Days paid Annual leave after 1 year of service completion Yearly Up and Down Air Ticket Medical Insurance Life Insurance Accommodation (Chargeable up to OMR 20/-) Note: Interested candidates please call us at 97699 11050 or 99302 65888 , or email your CV to recruitment@thegrowthhive.org . Job Type: Full-time Pay: ₹40,000.00 - ₹60,000.00 per month Benefits: Food provided Health insurance Provident Fund Schedule: Monday to Friday Rotational shift Weekend availability Work Location: In person

Posted 1 month ago

Apply

15.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position Overview Job Title: Lead Engineer Location: Pune, India Role Description Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT Platform/Infrastructure including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes: Planning and developing entire engineering solutions to accomplish business goals Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle Ensuring maintainability and reusability of engineering solutions Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow Reviewing engineering plans and quality to drive re-use and improve engineering capability Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy Best in class leave policy Gender neutral parental leaves 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for Industry relevant certifications and education Employee Assistance Program for you and your family members Comprehensive Hospitalization Insurance for you and your dependents Accident and Term life Insurance Complementary Health screening for 35 yrs. and above Your Key Responsibilities: The candidate is expected to; Hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities Champion engineering best practices and guide/mentor team to achieve high performance. Work closely with Business stakeholders, Tribe lead, Product Owner, Lead Architect to successfully deliver the business outcomes. Acquire functional knowledge of the business capability being digitized/re-engineered. Demonstrate ownership, inspire others, innovative thinking, growth mindset and collaborate for success. Your Skills & Experience: Minimum 15 years of IT industry experience in Full stack development Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS, Strong experience in Big data processing – Apache Spark, Hadoop, Bigquery, DataProc, Dataflow etc Strong experience in Kubernetes, OpenShift container platform Experience with Databases – Oracle, PostgreSQL, MongoDB, Redis/hazelcast, should understand data modeling, normalization, and performance optimization Experience in message queues (RabbitMQ/IBM MQ, JMS) and Data streaming i.e. 
Kafka, Pub/Sub, etc. Experience working on public cloud – GCP preferred, AWS or Azure Knowledge of various distributed/multi-tiered architecture styles – Microservices, Data mesh, Integration patterns, etc. Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc. Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operations Experience leading teams and mentoring developers Focus on quality – experience with TDD, BDD, Stress and Contract Tests Proficient in working with APIs (Application Programming Interfaces) and data formats like JSON, XML, YAML, Parquet, etc. Key Skills: Java Spring Boot NodeJS SQL/PLSQL ReactJS Advantageous: Prior experience in the Banking/Finance domain Experience with hybrid cloud solutions, preferably using GCP Experience with product development How we'll support you: Training and development to help you excel in your career Coaching and support from experts in your team A culture of continuous learning to aid progression A range of flexible benefits that you can tailor to suit your needs About Us And Our Teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 month ago

Apply

6.0 - 10.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Develop, optimize, and maintain scalable data pipelines using Python and PySpark. Design and implement data processing workflows leveraging GCP services such as BigQuery, Dataflow, Cloud Functions, and Cloud Storage.
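As a rough illustration of the Python/PySpark pipeline work described above, here is a minimal batch job that reads raw CSV files from Cloud Storage, aggregates them, and writes the result to BigQuery. It assumes a Dataproc-style environment with the spark-bigquery connector available; the bucket, dataset, and column names are placeholders.

```python
# Minimal PySpark batch sketch: Cloud Storage CSV -> aggregate -> BigQuery.
# Assumes the spark-bigquery connector is on the classpath (e.g. on Dataproc).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

# Read raw order files from a GCS bucket (placeholder path).
orders = (
    spark.read.option("header", True)
    .csv("gs://example-bucket/raw/orders/*.csv")
)

# Aggregate daily totals per region.
daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum(F.col("amount").cast("double")).alias("total_amount"))
)

# Write the result to a BigQuery table via the connector.
(
    daily_totals.write.format("bigquery")
    .option("table", "example-project.analytics.order_daily_totals")
    .option("temporaryGcsBucket", "example-bucket-tmp")
    .mode("overwrite")
    .save()
)

spark.stop()
```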

Posted 1 month ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

What you’ll do With moderate supervision, manage project's progress, metadata collection, development and management. Perform investigations on internal / external stakeholder queries with high level direction from the Team Leader Analyze problems, identify root cause, formulate findings and observations of results, suggest resolutions and communicate to internal / external stakeholders with moderate guidance from the Team Leader. Maintain current knowledge of industry regulatory requirements such as reporting mandates, concepts and procedures, compliance requirements, and regulatory framework and structure. Be able to support internal/external queries on data standards. Enter/maintain information in documentation repository. Follow established security protocols, identify and report potential vulnerabilities. Perform intermediate level data quality checks, following established procedures. What Experience You Need BS degree in a STEM major or equivalent discipline; Master’s Degree strongly preferred 2+ years of experience as a data engineer or related role Cloud certification strongly preferred Intermediate skills using programming languages - Python, SQL (Big Query) or scripting languages Basic understanding and experience with Google Cloud Platforms and an overall understanding of cloud computing concepts Experience building and maintaining simple data pipelines, following guidelines, transforming and entering data into a data pipeline in order for the content to be digested and usable for future projects Experience supporting the design and implementation of basic data models Demonstrates proficient Git usage and contributes to team repositories What could set you apart Master's Degree Experience with GCP (Cloud certification strongly preferred) Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with data pipeline and workflow management tools: Airflow, GCP dataflow etc. Experience with AI or Machine Learning Experience with Data Visualisation Tools such as Tableau or Looker Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
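One of the duties above is performing intermediate-level data quality checks following established procedures. The snippet below is a hedged sketch of such a check in Python using the BigQuery client; the project, table, column, and threshold are assumptions for illustration, not details from the posting.

```python
# Hedged sketch of a data quality check: fail if a column's NULL rate is too high.
from google.cloud import bigquery


def null_rate(client: bigquery.Client, table: str, column: str) -> float:
    """Return the fraction of rows in `table` where `column` is NULL."""
    query = f"""
        SELECT COUNTIF({column} IS NULL) / COUNT(*) AS null_rate
        FROM `{table}`
    """
    row = next(iter(client.query(query).result()))
    return row.null_rate or 0.0


if __name__ == "__main__":
    client = bigquery.Client()  # uses application-default credentials
    rate = null_rate(client, "example-project.analytics.customers", "email")
    if rate > 0.05:  # example threshold: flag if more than 5% of emails are missing
        raise SystemExit(f"Data quality check failed: email null rate {rate:.1%}")
    print(f"Check passed: email null rate {rate:.1%}")
```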

Posted 1 month ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools . Agile environments (e.g. Scrum, XP) Relational databases Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. 
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 1 month ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. What you’ll do With moderate supervision, manage project's progress, metadata collection, development and management. Perform investigations on internal / external stakeholder queries with high level direction from the Team Leader Analyze problems, identify root cause, formulate findings and observations of results, suggest resolutions and communicate to internal / external stakeholders with moderate guidance from the Team Leader. Maintain current knowledge of industry regulatory requirements such as reporting mandates, concepts and procedures, compliance requirements, and regulatory framework and structure. Be able to support internal/external queries on data standards. Enter/maintain information in documentation repository. Follow established security protocols, identify and report potential vulnerabilities. Perform intermediate level data quality checks, following established procedures. What Experience You Need BS degree in a STEM major or equivalent discipline; Master’s Degree strongly preferred 2+ years of experience as a data engineer or related role Cloud certification strongly preferred Intermediate skills using programming languages - Python, SQL (Big Query) or scripting languages Basic understanding and experience with Google Cloud Platforms and an overall understanding of cloud computing concepts Experience building and maintaining simple data pipelines, following guidelines, transforming and entering data into a data pipeline in order for the content to be digested and usable for future projects Experience supporting the design and implementation of basic data models Demonstrates proficient Git usage and contributes to team repositories What could set you apart Master's Degree Experience with GCP (Cloud certification strongly preferred) Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with data pipeline and workflow management tools: Airflow, GCP dataflow etc. Experience with AI or Machine Learning Experience with Data Visualisation Tools such as Tableau or Looker Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc. We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. 
Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 1 month ago

Apply

4.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Key Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes using GCP services such as BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer, Cloud Functions, and Cloud Run. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions. Implement data integration solutions to ingest, process, and store large volumes of structured and unstructured data from various sources. Optimize and tune data pipelines for performance, reliability, and cost-efficiency. Ensure data quality and integrity through data validation, cleansing, and transformation processes. Develop and maintain data models, schemas, and metadata to support data analytics and reporting. Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption to data workflows. Stay up-to-date with the latest GCP technologies and best practices, and provide recommendations for continuous improvement. Mentor and guide junior data engineers, fostering a culture of knowledge sharing and collaboration. Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 4 to 6 years of experience in data engineering, with a strong focus on GCP. Proficiency in GCP services such as BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer, Cloud Functions, and Cloud Run. Strong programming skills in Python and PL/SQL. Experience with SQL and NoSQL databases. Knowledge of data warehousing concepts and best practices. Familiarity with data integration tools and frameworks. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Ability to work in a fast-paced, dynamic environment.
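To make the cost-efficiency and data-modeling points above concrete, here is an illustrative sketch of creating a date-partitioned, clustered BigQuery table from Python. The project, dataset, and schema are placeholders; the right partitioning and clustering choices in practice depend on the actual query patterns.

```python
# Illustrative sketch: create a date-partitioned, clustered BigQuery table,
# one common way to keep scan costs down in analytics pipelines.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

table = bigquery.Table(
    "example-project.analytics.page_events",
    schema=[
        bigquery.SchemaField("event_date", "DATE"),
        bigquery.SchemaField("user_id", "STRING"),
        bigquery.SchemaField("page", "STRING"),
        bigquery.SchemaField("latency_ms", "INT64"),
    ],
)
# Partition by day on event_date and cluster by user_id so queries that
# filter on a date range and a user scan less data.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",
)
table.clustering_fields = ["user_id"]

table = client.create_table(table, exists_ok=True)
print(f"Created or found {table.full_table_id}")
```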

Posted 1 month ago

Apply

9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald’s: One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe. Position Summary: We are seeking an experienced Data Architect to design, implement, and optimize scalable data solutions on Amazon Web Services (AWS) and / or Google Cloud Platform (GCP). The ideal candidate will lead the development of enterprise-grade data architectures that support analytics, machine learning, and business intelligence initiatives while ensuring security, performance, and cost optimization. Who we are looking for: Primary Responsibilities: Key Responsibilities Architecture & Design: Design and implement comprehensive data architectures using AWS or GCP services Develop data models, schemas, and integration patterns for structured and unstructured data Create solution blueprints, technical documentation, architectural diagrams, and best practice guidelines Implement data governance frameworks and ensure compliance with security standards Design disaster recovery and business continuity strategies for data systems Technical Leadership: Lead cross-functional teams in implementing data solutions and migrations Provide technical guidance on cloud data services selection and optimization Collaborate with stakeholders to translate business requirements into technical solutions Drive adoption of cloud-native data technologies and modern data practices Platform Implementation: Implement data pipelines using cloud-native services (AWS Glue, Google Dataflow, etc.) 
Configure and optimize data lakes and data warehouses (S3 / Redshift, GCS / BigQuery) Set up real-time streaming data processing solutions (Kafka, Airflow, Pub / Sub) Implement automated data quality monitoring and validation processes Establish CI/CD pipelines for data infrastructure deployment Performance & Optimization: Monitor and optimize data pipeline performance and cost efficiency Implement data partitioning, indexing, and compression strategies Conduct capacity planning and scaling recommendations Troubleshoot complex data processing issues and performance bottlenecks Establish monitoring, alerting, and logging for data systems Skill: Bachelor’s degree in Computer Science, Data Engineering, or related field 9+ years of experience in data architecture and engineering 5+ years of hands-on experience with AWS or GCP data services Experience with large-scale data processing and analytics platforms AWS Redshift, S3, Glue, EMR, Kinesis, Lambda AWS Data Pipeline, Step Functions, CloudFormation BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub GCP Cloud Functions, Cloud Composer, Deployment Manager IAM, VPC, and security configurations SQL and NoSQL databases Big data technologies (Spark, Hadoop, Kafka) Programming languages (Python, Java, SQL) Data modeling and ETL/ELT processes Infrastructure as Code (Terraform, CloudFormation) Container technologies (Docker, Kubernetes) Data warehousing concepts and dimensional modeling Experience with modern data architecture patterns Real-time and batch data processing architectures Data governance, lineage, and quality frameworks Business intelligence and visualization tools Machine learning pipeline integration Strong communication and presentation abilities Leadership and team collaboration skills Problem-solving and analytical thinking Customer-focused mindset with business acumen Preferred Qualifications: Master’s degree in relevant field Cloud certifications (AWS Solutions Architect, GCP Professional Data Engineer) Experience with multiple cloud platforms Knowledge of data privacy regulations (GDPR, CCPA) Work location: Hyderabad, India Work pattern: Full time role. Work mode: Hybrid. Additional Information: McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. 
At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description The Security Platform Engineering team, EPEO, is looking for a passionate, experienced DevOps Engineer who is excited to foray into any technology and create innovative products & services. The interested candidate should have experience in designing and implementing frontend technologies, constructing backend APIs and databases, setting up infrastructure in the cloud, and automating repeatable tasks. The engineer should also be a team player, working with a team of developers from conception to the final product stage. Responsibilities YOUR TYPICAL DAY HERE WOULD BE: Design/Develop APIs using Java or Python and deploy using GCP Services Design, build, and maintain robust and scalable data pipelines to ingest, process, and transform data from various sources using GCP Services Contribute to the design and architecture of our data infrastructure and automate data pipeline deployment and management Create websites using Angular, CSS, Hugo, JavaScript/TypeScript Automate repeatable tasks and workflows to improve process efficiency. Design and build observability dashboards using Dynatrace, Grafana, Looker, etc. Qualifications WHAT YOUR SKILLSET LOOKS LIKE: A relevant Bachelor's or Master’s Degree in computer science / engineering 3+ years of experience in developing RESTful endpoints (Python or Java) and websites and deploying them using GCP Services Proficiency in using GCP services, including Cloud Run, BigQuery, Dataflow, and Google Cloud Storage (GCS). Experience working in a DevOps or Agile development team Deep understanding of SRE concepts, including monitoring, alerting, automation, and incident management WOULD BE GREAT IF YOU ALSO BRING: GCP Certification
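As a hedged example of the "develop APIs in Python and deploy on GCP" responsibility, the sketch below shows a minimal Flask service of the kind that could be containerized and run on Cloud Run. The endpoints and payloads are invented for illustration; a real service would add authentication, logging, and a datastore.

```python
# Minimal Flask API sketch suitable for containerizing and deploying to Cloud Run.
import os

from flask import Flask, jsonify

app = Flask(__name__)


@app.get("/healthz")
def healthz():
    """Liveness endpoint used by monitoring/alerting."""
    return jsonify(status="ok")


@app.get("/v1/findings/<finding_id>")
def get_finding(finding_id: str):
    """Placeholder resource endpoint; a real service would query a datastore."""
    return jsonify(id=finding_id, severity="unknown")


if __name__ == "__main__":
    # Cloud Run injects the PORT environment variable into the container.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```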

Posted 1 month ago

Apply

3.0 - 4.0 years

0 Lacs

Hyderābād

On-site

Job description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer. In this role, you will: Be responsible for performing system development work around ETL, which can include both the development of new functions and facilities and the ongoing systems support of live systems. Be responsible for the documentation, coding, and maintenance of new and existing Extract, Transform, and Load (ETL) processes within the Enterprise Data Warehouse. Investigate live systems faults, diagnose problems, and propose and provide solutions. Work closely with various teams to design, build, test, deploy and maintain insightful MI reports. Support System Acceptance Testing, System Integration and Regression Testing. Identify any issues that may pose a delivery risk, formulate preventive actions or corrective measures, and escalate major project risks & issues to the service owner in a timely manner. Execute test cases and log defects. Be proactive in understanding the existing system, identifying areas for improvement, and taking ownership of assigned tasks. Work independently with minimal supervision while ensuring timely delivery of tasks. Requirements To be successful in this role, you should meet the following requirements: 3-4 years of experience in Data Warehousing, specialized in ETL. Given that the current team is highly technical in nature, the expectation is that the candidate has experience in technologies like DataStage, Teradata Vantage, Unix scripting, and scheduling using Control-M and DevOps tools. The candidate should possess good knowledge of SQL and demonstrate the ability to write efficient and optimized queries. Hands-on experience or knowledge of GCP’s data storage and processing services such as BigQuery, Dataflow, Bigtable, Cloud Spanner, and Cloud SQL would be an added advantage. Hands-on experience with Unix, Git and Jenkins would be an added advantage. This individual should be able to develop and implement solutions on both on-prem and Google Cloud Platform (GCP), conducting migration where necessary to bring tools and other elements into the cloud, along with software upgrades. Should be proficient in using JIRA and Confluence and experienced in working on projects that follow Agile methodologies. You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India

Posted 1 month ago

Apply

4.0 years

2 - 5 Lacs

Hyderābād

On-site

About this role: Wells Fargo is seeking a Senior Cloud Platform Engineer with skills in IaC (Infrastructure as Code) tools such as Terraform, Docker/OCI image creation, Kubernetes, Helm Charts, and Python. In this role, you will need: Understanding of Cloud Platform Technologies (GCP preferred) in the big data and data warehousing space (BigQuery, Dataproc, Dataflow, Data Catalog, Cloud Composer/Airflow, GKE/Anthos). Hands-on experience in IaC (Infrastructure as Code) tools such as Terraform, Docker/OCI image creation, Kubernetes, Helm Charts, Self-healing mechanisms, Load-balancing, API Gateway. In-depth knowledge of Cloud tools/solutions such as Cloud Pub/Sub, GKE, IAM, Scalability, Fault-tolerant design, Availability, BCP. Ability to quickly learn and adapt to new cloud platforms/technologies Strong development experience in Python Extensive experience working with Python API-based solution design and integration Required Qualifications, International: 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education Bachelor's or Master's Degree in Computer Science or equivalent Desired Qualifications: GCP DevOps, Terraform and K8s Certification Posting End Date: 2 Jul 2025 *Job posting may come down early due to volume of applicants. We Value Equal Opportunity Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants with Disabilities To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo. Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more. Wells Fargo Recruitment and Hiring Requirements: a. Third-Party recordings are prohibited unless authorized by Wells Fargo. b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Mumbai Metropolitan Region

Remote

Company Description Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Job Description Key Responsibilities Design, build, and maintain scalable and secure relational and cloud-based database systems. Migrate data from spreadsheets or third-party sources into databases (PostgreSQL, MySQL, BigQuery). Create and maintain automated workflows and scripts for reliable, consistent data ingestion. Optimize query performance and indexing to improve data retrieval efficiency. Implement access controls, encryption, and data security best practices to ensure compliance. Monitor database health and troubleshoot issues proactively using appropriate tools. Collaborate with full-stack developers and data researchers to align data architecture with application needs. Uphold data quality through validation rules, constraints, and referential integrity checks. Keep up-to-date with emerging technologies and propose improvements to data workflows. Leverage tools like Python (Pandas, SQLAlchemy, PyDrive), and version control (Git). Support Agile development practices and CI/CD pipelines where applicable. Required Skills And Experience Strong SQL skills and understanding of database design principles (normalization, indexing, relational integrity). Experience with relational databases such as PostgreSQL or MySQL. Working knowledge of Python, including data manipulation and scripting (e.g., using Pandas, SQLAlchemy). Experience with data migration and ETL processes, including integrating data from spreadsheets or external sources. Understanding of data security best practices, including access control, encryption, and compliance. Ability to write and maintain import workflows and scripts to automate data ingestion and transformation. 
Experience with cloud-based databases, such as Google BigQuery or AWS RDS. Familiarity with cloud services (e.g., AWS Lambda, GCP Dataflow) and serverless data processing. Exposure to data warehousing tools like Snowflake or Redshift. Experience using monitoring tools such as Prometheus, Grafana, or the ELK Stack. Good analytical and problem-solving skills, with strong attention to detail. Team collaboration skills, especially with developers and analysts, and ability to work independently. Proficiency with version control systems (e.g., Git). Strong communication skills — written and verbal. Preferred / Nice-to-Have Skills Bachelor’s degree in Computer Science, Information Systems, or a related field. Experience working with APIs for data ingestion and third-party system integration. Familiarity with CI/CD pipelines (e.g., GitHub Actions, Jenkins). Python experience using modules such as gspread, PyDrive, PySpark, or object-oriented design patterns. Experience in Agile/Scrum teams or working with product development cycles. Experience using Tableau and Tableau Prep for data visualization and transformation. Why Join Us Monthly long weekends — every third Friday off Wellness reimbursement to support your health and balance Paid parental leave Remote-first with flexibility and trust Work with a world-class data and marketing team inside a globally recognized brand Qualifications 5+ Years exp in Database Engineering. Additional Information Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves
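The spreadsheet-to-database migration workflow described above can be illustrated with a short pandas + SQLAlchemy script. This is a sketch under assumed names: the connection string, file path, table, and the provider_name column are placeholders, and a production workflow would add schema validation and error handling.

```python
# Hedged sketch: load a spreadsheet, apply light cleansing, append to Postgres.
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string; real credentials would come from a secret store.
ENGINE = create_engine("postgresql+psycopg2://user:password@localhost:5432/research")


def load_sheet(path: str, table: str) -> int:
    """Read a spreadsheet, normalize column names, drop bad rows, and append to a table."""
    df = pd.read_excel(path)  # or pd.read_csv(...) for CSV exports
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates().dropna(subset=["provider_name"])  # basic validation
    df.to_sql(table, ENGINE, if_exists="append", index=False)
    return len(df)


if __name__ == "__main__":
    rows = load_sheet("data/providers_2024.xlsx", "providers_staging")
    print(f"Loaded {rows} rows into providers_staging")
```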

Posted 1 month ago

Apply

3.0 years

3 - 10 Lacs

Vadodara

On-site

M3J Technical Services is seeking a Data Integration & Reporting Analyst adept at automating reports, extracting and cleansing data, and crafting impactful visualizations for KPI’s using tools like Power BI and Excel. You'll develop data-driven applications for desktop, web, and mobile platforms, ensuring our business remains agile and poised for growth. If you're passionate about leveraging data to drive strategic solutions, join our team! Local candidates based in Vadodara, Gujarat preferred. Responsibilities: Collaborate with stakeholders to design and publish Power BI reports aligned with business goals. Analyze and understand business processes to develop reports tailored to specific operational needs. Prepare and transform data from sources such as SQL Server, Excel, and SharePoint using Microsoft Fabric tools, including Dataflow Gen 2, Power Query, Lakehouse, and other related tools. Develop data models and optimize report performance, including row-level security. Maintain clear documentation and provide user training and support for Power BI. Actively contribute to process improvement initiatives by leveraging the Microsoft Power Platform (Power Apps, Power Automate, SharePoint) to enhance data collection and workflow automation. Qualifications: Bachelor’s degree in Computer Science, Industrial Engineering, Data Science, or Related field; or equivalent work experience. Solid understanding of BI concepts and data visualization best practices. 3+ years of hands-on experience with Power BI development. Strong skills in DAX, Power Query (M), and data modeling. Proficient in SQL and working with relational databases. 5+ years of working experience with Excel and Power Query. Experience with Fabric and other data integration tools. High attention to detail and the ability to work independently. Strong analytical and organizational skills. Excellent written and verbal communication skills. Results-oriented, proactive, and possessing a high level of integrity. Microsoft Certified: Power BI Data Analyst Associate is a plus Preferred Qualifications: Experience with Power BI Service administration (fabric, dataflows, workspaces, security, and dataset refresh). Familiarity with Microsoft Fabric, Power Automate, or SharePoint. Able to work independently and take initiative to improve data collection and reporting using modern tools and best practices. Language Requirement: Fluent in English Schedule: Monday to Friday Working Hours: 8am to 5pm Central US Time, or must be available to work at least 4 hours of the day during 8am to 4pm US Central Time Zone. Work Location: Vadodara, Gujarat, India (Preference will be given to candidates located in Vadodara, Gujarat.) Job Types: Full-time, Permanent Benefits: Flexible schedule Paid sick time Paid time off Schedule: Monday to Friday Supplemental Pay: Yearly bonus Application Question(s): Please share with us your desired salary. Have you implemented reuse of dataflows across multiple reports or workspaces? Would you be open to presenting a Power BI report you developed professionally — and explaining your approach to data connection, transformation, and solving performance or business logic challenges? Work Location: In person

Posted 1 month ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

GCP Data Engineer (5+ Years Exp) | Hybrid - Hyderabad Location: Hyderabad, India Experience: 5+ Years We are looking for an experienced GCP Data Engineer to join our growing data team. If you're passionate about building scalable data pipelines, optimizing workflows, and working with modern cloud-native tech, we want to hear from you! 🚀 Key Responsibilities: Design, develop, and maintain robust data pipelines on Google Cloud Platform (GCP) Work with structured and unstructured data to support analytics, ML, and reporting use cases Integrate data sources using Python , APIs, and GCP-native services (BigQuery, Dataflow, Pub/Sub, etc.) Implement data quality and governance practices using DBT or Collibra Collaborate cross-functionally with data analysts, data scientists, and business stakeholders 🛠️ Must-Have Skills: 5+ years of experience in data engineering or related roles Strong proficiency in GCP data services (BigQuery, Cloud Storage, Dataflow, Composer, etc.) Excellent Python programming skills, especially for ETL development Hands-on experience with DBT or Collibra Strong SQL skills and familiarity with relational and cloud-native databases Solid understanding of data modeling, pipeline orchestration, and performance tuning ✅ Good to Have: Experience with CI/CD pipelines and version control (Git) Knowledge of data security and compliance in cloud environments Familiarity with Agile methodologies 💼 What We Offer: Competitive compensation Flexible hybrid working model Opportunity to work on cutting-edge cloud data projects Collaborative and growth-focused culture 📩 Interested? Apply directly via LinkedIn or send your resume to [sasidhar.m@technogenindia.com].

Posted 1 month ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are seeking a skilled Data Engineer with over 6+ years of experience to design, build, and maintain scalable data pipelines and perform advanced data analysis to support business intelligence and data-driven decision-making. The ideal candidate will have a strong foundation in computer science principles, extensive experience with SQL and big data tools, and proficiency in cloud platforms and data visualization tools. Key Responsibilities: Design, develop, and maintain robust, scalable ETL pipelines using Apache Airflow, DBT, Composer, Control-M, Cron, Luigi, and similar tools. Build and optimize data architectures including data lakes and data warehouses. Integrate data from multiple sources ensuring data quality and consistency. Collaborate with data scientists, analysts, and stakeholders to translate business requirements into technical solutions. Analyze complex datasets to identify trends, generate actionable insights, and support decision-making. Develop and maintain dashboards and reports using Tableau, Power BI, and Jupyter Notebooks for visualization and pipeline validation. Manage and optimize relational and NoSQL databases such as MySQL, PostgreSQL, Oracle, MongoDB, and DynamoDB. Work with big data tools and frameworks including Hadoop, Spark, Hive, Kafka, Informatica, Talend, SSIS, and Dataflow. Utilize cloud data services and warehouses like AWS Glue, GCP Dataflow, Azure Data Factory, Snowflake, Redshift, and BigQuery. Support CI/CD pipelines and DevOps workflows using Git, Docker, Terraform, and related tools. Ensure data governance, security, and compliance standards are met. Participate in Agile and DevOps processes to enhance data engineering workflows. Required Qualifications: 6+ years of professional experience in data engineering and data analysis roles. Strong proficiency in SQL and experience with database management systems such as MySQL, PostgreSQL, Oracle, and MongoDB. Hands-on experience with big data tools like Hadoop and Apache Spark. Proficient in Python programming. Experience with data visualization tools such as Tableau, Power BI, and Jupyter Notebooks. Proven ability to design, build, and maintain scalable ETL pipelines using tools like Apache Airflow, DBT, Composer (GCP), Control-M, Cron, and Luigi. Familiarity with data engineering tools including Hive, Kafka, Informatica, Talend, SSIS, and Dataflow. Experience working with cloud data warehouses and services (Snowflake, Redshift, BigQuery, AWS Glue, GCP Dataflow, Azure Data Factory). Understanding of data modeling concepts and data lake/data warehouse architectures. Experience supporting CI/CD practices with Git, Docker, Terraform, and DevOps workflows. Knowledge of both relational and NoSQL databases, including PostgreSQL, BigQuery, MongoDB, and DynamoDB. Exposure to Agile and DevOps methodologies. Experience with Amazon Web Services (S3, Glue, Redshift, Lambda, Athena) Preferred Skills: Strong problem-solving and communication skills. Ability to work independently and collaboratively in a team environment. Experience with service development, REST APIs, and automation testing is a plus. Familiarity with version control systems and workflow automation.
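As one concrete example of the pipeline-validation work mentioned above, the sketch below reconciles row counts between a source PostgreSQL table and its BigQuery target for a given load date. Connection details, table names, and the date column are assumptions made purely for illustration.

```python
# Illustrative post-load reconciliation check: source Postgres vs. BigQuery target.
import datetime

import sqlalchemy
from google.cloud import bigquery

# Placeholder source connection; a real job would read this from configuration.
SOURCE = sqlalchemy.create_engine("postgresql+psycopg2://etl:secret@source-db:5432/sales")


def source_count(load_date: datetime.date) -> int:
    """Count source rows for one load date."""
    with SOURCE.connect() as conn:
        result = conn.execute(
            sqlalchemy.text("SELECT COUNT(*) FROM orders WHERE order_date = :d"),
            {"d": load_date},
        )
        return result.scalar_one()


def target_count(load_date: datetime.date) -> int:
    """Count target rows in BigQuery for the same load date."""
    client = bigquery.Client()
    job = client.query(
        "SELECT COUNT(*) AS n FROM `example-project.warehouse.orders` "
        "WHERE order_date = @d",
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("d", "DATE", load_date)]
        ),
    )
    return next(iter(job.result())).n


if __name__ == "__main__":
    day = datetime.date(2024, 6, 1)
    src, tgt = source_count(day), target_count(day)
    status = "OK" if src == tgt else "MISMATCH"
    print(f"{day}: source={src} target={tgt} -> {status}")
```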

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

General Skills & Experience: Minimum 10-18 yrs of experience
• Expertise in Spark (Scala/Python), Kafka, and cloud-native big data services (GCP, AWS, Azure) for ETL, batch, and stream processing.
• Deep knowledge of cloud platforms (AWS, Azure, GCP), including certification (preferred).
• Experience designing and managing advanced data warehousing and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse).
• Proven experience with building, managing, and optimizing ETL/ELT pipelines and data workflows for large-scale systems.
• Strong experience with data lakes, storage formats (Parquet, ORC, Delta, Iceberg), and data movement strategies (cloud and hybrid).
• Advanced knowledge of data modeling, SQL development, data partitioning, optimization, and database administration.
• Solid understanding and experience with Master Data Management (MDM) solutions and reference data frameworks.
• Proficient in implementing Data Lineage, Data Cataloging, and Data Governance solutions (e.g., AWS Glue Data Catalog, Azure Purview).
• Familiar with data privacy, data security, compliance regulations (GDPR, CCPA, HIPAA, etc.), and best practices for enterprise data protection.
• Experience with data integration tools and technologies (e.g., AWS Glue, GCP Dataflow, Apache Nifi/Airflow, etc.).
• Expertise in batch and real-time data processing architectures; familiarity with event-driven, microservices, and message-driven patterns.
• Hands-on experience in Data Analytics, BI & visualization tools (PowerBI, Tableau, Looker, Qlik, etc.) and supporting complex reporting use-cases.
• Demonstrated capability with data modernization projects: migrations from legacy/on-prem systems to cloud-native architectures.
• Experience with data quality frameworks, monitoring, and observability (data validation, metrics, lineage, health checks).
• Background in working with structured, semi-structured, unstructured, temporal, and time series data at large scale.
• Familiarity with Data Science and ML pipeline integration (DevOps/MLOps, model monitoring, and deployment practices).
• Experience defining and managing enterprise metadata strategies.
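For illustration of the Kafka and stream-processing expertise listed above, here is a minimal PySpark Structured Streaming sketch that reads a Kafka topic, parses JSON events, and writes Parquet to Cloud Storage. Broker addresses, topic, schema, and paths are placeholders, and running it requires the spark-sql-kafka package on the cluster.

```python
# Minimal Kafka -> Spark Structured Streaming -> Parquet sketch (placeholder names).
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("clickstream-stream").getOrCreate()

# Expected shape of each JSON event on the topic (illustrative schema).
schema = StructType([
    StructField("user_id", StringType()),
    StructField("url", StringType()),
    StructField("event_ts", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "clickstream")
    .option("startingOffsets", "latest")
    .load()
)

events = (
    raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
    .withWatermark("event_ts", "10 minutes")  # tolerate late events up to 10 minutes
)

query = (
    events.writeStream.format("parquet")
    .option("path", "gs://example-bucket/curated/clickstream/")
    .option("checkpointLocation", "gs://example-bucket/checkpoints/clickstream/")
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()
```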

Posted 1 month ago

Apply

4.0 years

0 Lacs

India

Remote

We are looking for a Google Cloud Data Engineer who will help us build a highly scalable and reliable platform to match our exponential growth. As a Google Cloud Data Engineer, you will be responsible for building a solid back end infrastructure which will enable data delivery in near real-time using next-gen technologies. Title : Google Cloud Data Engineer Location : Remote Work Employment Type : Full Time Work Timings : 2PM to 11 PM No of Openings : 3 We are looking for a candidate with Google Cloud Data Engineering experience only who can join us within 15 days or less. Applications not meeting this requirement will not be considered. Roles and Responsibilities: Design, develop, and maintain scalable data pipelines on Google Cloud Platform (GCP). Implement data processing solutions using GCP services such as BigQuery, Dataflow, Data Proc, Pub/Sub, and Cloud Storage. Optimize data processing and storage for performance, cost, and scalability. Ensure data quality and integrity by implementing best practices for data governance and monitoring. Develop and maintain documentation for data pipelines, architectures, and processes. Stay up-to-date with the latest advancements in data engineering and GCP technologies. Qualifications: Bachelor’s degree in Computer Science, Engineering, or a related field. Proven experience as a Data Engineer with a focus on Google Cloud Platform (GCP). Proficiency in GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage , and others. Strong programming skills in Python, Java, or similar languages. Experience with SQL and relational databases. Familiarity with data modeling, ETL processes, and data warehousing concepts. Knowledge of best practices in data security and privacy. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Preferred Skills : Google Cloud Data Engineer certification. About Techolution: Techolution is a leading innovation consulting company on track to become one of the most admired brands in the world for "innovation done right". Our purpose is to harness our expertise in novel technologies to deliver more profits for our enterprise clients while helping them deliver a better human experience for the communities they serve. With that, we are now fully committed to helping our clients build the enterprise of tomorrow by making the leap from Lab Grade AI to Real World AI. In 2019, we won the prestigious Inc. 500 Fastest-Growing Companies in America award, only 4 years after its formation. In 2022, Techolution was honored with the “Best-in-Business” title by Inc. for “Innovation Done Right”. Most recently, we received the “AIConics” trophy for being the Top AI Solution Provider of the Year at the AI Summit in New York. Let’s give you more insights! Some videos you wanna watch! Life at Techolution GoogleNext 2023 Ai4 - Artificial Intelligence Conferences 2023 WaWa - Solving Food Wastage Saving lives - Brooklyn Hospital Innovation Done Right on Google Cloud Techolution featured on Worldwide Business with KathyIreland Techolution presented by ION World’s Greatest Visit us @ www.techolution.com : To know more about our revolutionary core practices and getting to know in detail about how we enrich the human experience with technology.

Posted 1 month ago

Apply

5.0 - 7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests in order to ensure full compliance with specified standards and end user requirements. You will execute tests using established plans and scripts; document problems in an issues log and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance the effectiveness of QA strategies.

What You Will Do
Independently develop scalable and reliable automated tests and frameworks for testing software solutions.
Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments.
Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model.
Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations.

What Experience You Need
Bachelor's degree in a STEM major or equivalent experience
5-7 years of software testing experience
Able to create and review test automation according to specifications
Ability to write, debug, and troubleshoot code in Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others with respect to software validation
Created test strategies and plans
Led complex testing efforts or projects
Participated in Sprint Planning as the Test Lead
Collaborated with Product Owners, SREs, and Technical Architects to define testing strategies and plans
Design and development of microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
Deploy and release software using Jenkins CI/CD pipelines; understand infrastructure-as-code concepts, Helm charts, and Terraform constructs
Cloud certification strongly preferred

What Could Set You Apart
An ability to demonstrate successful performance of our Success Profile skills, including:
Attention to Detail - Define test case candidates for automation that are outside of product specifications, i.e. negative testing; create thorough and accurate documentation of all work, including status updates to summarize project highlights; validate that processes operate properly and conform to standards
Automation - Automate defined test cases and test suites per project
Collaboration - Collaborate with Product Owners and the development team to plan and assist with user acceptance testing; collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans
Execution - Develop scalable and reliable automated tests; develop performance testing scripts to assure products are adhering to the documented SLO/SLI/SLAs; specify the need for test data types for automated testing; create automated tests and test data for projects; develop automated regression suites; integrate automated regression tests into the CI/CD pipeline; work with teams on E2E testing strategies and plans against multiple product integration points
Quality Control - Perform defect analysis and in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and improve processes; analyze results of functional and non-functional tests and make recommendations for improvements
Performance / Resilience - Understand application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform; conduct performance and resilience testing to ensure the products meet SLAs/SLOs
Quality Focus - Review test cases for complete functional coverage; review the quality section of the Production Readiness Review for completeness; recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; ensure communications are thorough and accurate for all work documentation, including status and project updates
Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defect fixes

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
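For the Beam/Dataflow test-automation side of this role, the following is a hedged sketch of what an automated pipeline test can look like, using Beam's TestPipeline and assert_that utilities. The DedupeEvents transform and its sample records are invented for illustration and are not Equifax code.

```python
# Hedged sketch: unit-test a Beam transform with TestPipeline and assert_that.
# The transform and test data are hypothetical.
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to


class DedupeEvents(beam.PTransform):
    """Example transform under test: keep the earliest record per event_id."""
    def expand(self, pcoll):
        return (
            pcoll
            | beam.Map(lambda e: (e["event_id"], e))           # key by event id
            | beam.GroupByKey()
            | beam.Map(lambda kv: min(kv[1], key=lambda r: r["ts"]))  # earliest wins
        )


def test_dedupe_events():
    events = [
        {"event_id": "a", "ts": 1},
        {"event_id": "a", "ts": 2},   # duplicate with a later timestamp
        {"event_id": "b", "ts": 3},
    ]
    with TestPipeline() as p:
        result = (p
                  | beam.Create(events)
                  | DedupeEvents()
                  | beam.Map(lambda e: e["event_id"]))
        # The assertion is checked when the pipeline runs at the end of the with-block.
        assert_that(result, equal_to(["a", "b"]))
```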

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Mulshi, Maharashtra, India

On-site

Area(s) of responsibility

About Birlasoft
Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company’s consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group’s 170-year heritage of building sustainable communities.

About the Job
Ability to relate the product functionality to business processes, and thus offer implementation advice to customers on how to meet their various business scenarios.

Job Title: GCP BigQuery Engineer
Location: Pune/Bangalore/Mumbai
Educational Background: BE/BTech

Key Responsibilities - Must-Have Skills
Should have 4-8 years of experience.
Data Quality Management (DQM) Specialist - Quick Base (Location: Mumbai, Pune, Bangalore). We are seeking a highly skilled and experienced Data Quality Management (DQM) Specialist with expertise in Quick Base to join our team, with hands-on experience in implementing and managing data quality initiatives and a strong focus on utilizing Quick Base as a data quality management tool. As a DQM Specialist, you will be responsible for developing, implementing, and maintaining data quality processes and standards using Quick Base to ensure the accuracy, completeness, and consistency of our organization's data assets.
Experience with data modeling, schema design, and optimization techniques in BigQuery.
Hands-on experience with GCP services such as Cloud Dataflow, Cloud Storage, Data Transfer Service, and Data Studio.
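To illustrate the BigQuery data modeling, schema design, and optimization skills mentioned above, here is a small sketch using the google-cloud-bigquery Python client to create a date-partitioned, clustered table. The project, dataset, table, and column names are placeholders, not anything specific to this posting.

```python
# Hedged sketch: create a partitioned and clustered BigQuery table with the Python client.
# Resource names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
    bigquery.SchemaField("order_date", "DATE", mode="REQUIRED"),
]

table = bigquery.Table("example-project.sales.orders", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_date",              # partition pruning limits scanned bytes and cost
)
table.clustering_fields = ["customer_id"]  # co-locate rows for cheaper filters and joins

table = client.create_table(table, exists_ok=True)
print(f"Created {table.full_table_id}")
```

Partitioning on the date column keeps time-bounded queries from scanning the full table, and clustering on customer_id improves locality for common filter and aggregation patterns; both are standard BigQuery optimization levers.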

Posted 1 month ago

Apply


3.0 years

0 Lacs

India

On-site

The client is looking to hire Nursing & Healthcare Assistants for their Rehabilitation Centre in Oman. Candidates willing to relocate to Oman may apply. Below are the vacancies we have for this project:

✓ Nursing Assistants
Qualification: GNM (or) BSc Nursing
Experience: 3 years
Positive Dataflow reports are mandatory

✓ Healthcare Assistants
Qualification: GNM (or) BSc Nursing
Experience: 3 years
Positive Dataflow reports are mandatory

Other benefits:
• Free joining ticket (reimbursed after the 3-month probation period)
• 30 days paid annual leave after 1 year of service completion
• Yearly up-and-down air ticket
• Medical insurance
• Life insurance
• Accommodation (charged at a nominal fee)

Posted 1 month ago

Apply

3.0 years

0 Lacs

Vadodara, Gujarat

On-site

M3J Technical Services is seeking a Data Integration & Reporting Analyst adept at automating reports, extracting and cleansing data, and crafting impactful visualizations for KPIs using tools like Power BI and Excel. You'll develop data-driven applications for desktop, web, and mobile platforms, ensuring our business remains agile and poised for growth. If you're passionate about leveraging data to drive strategic solutions, join our team! Local candidates based in Vadodara, Gujarat preferred.

Responsibilities:
Collaborate with stakeholders to design and publish Power BI reports aligned with business goals.
Analyze and understand business processes to develop reports tailored to specific operational needs.
Prepare and transform data from sources such as SQL Server, Excel, and SharePoint using Microsoft Fabric tools, including Dataflow Gen 2, Power Query, Lakehouse, and other related tools.
Develop data models and optimize report performance, including row-level security.
Maintain clear documentation and provide user training and support for Power BI.
Actively contribute to process improvement initiatives by leveraging the Microsoft Power Platform (Power Apps, Power Automate, SharePoint) to enhance data collection and workflow automation.

Qualifications:
Bachelor’s degree in Computer Science, Industrial Engineering, Data Science, or a related field; or equivalent work experience.
Solid understanding of BI concepts and data visualization best practices.
3+ years of hands-on experience with Power BI development.
Strong skills in DAX, Power Query (M), and data modeling.
Proficient in SQL and working with relational databases.
5+ years of working experience with Excel and Power Query.
Experience with Fabric and other data integration tools.
High attention to detail and the ability to work independently.
Strong analytical and organizational skills.
Excellent written and verbal communication skills.
Results-oriented, proactive, and possessing a high level of integrity.
Microsoft Certified: Power BI Data Analyst Associate is a plus.

Preferred Qualifications:
Experience with Power BI Service administration (Fabric, dataflows, workspaces, security, and dataset refresh).
Familiarity with Microsoft Fabric, Power Automate, or SharePoint.
Able to work independently and take initiative to improve data collection and reporting using modern tools and best practices.

Language Requirement: Fluent in English
Schedule: Monday to Friday
Working Hours: 8 AM to 5 PM US Central Time, or must be available to work at least 4 hours of the day between 8 AM and 4 PM US Central Time.
Work Location: Vadodara, Gujarat, India (preference will be given to candidates located in Vadodara, Gujarat)
Job Types: Full-time, Permanent

Benefits:
Flexible schedule
Paid sick time
Paid time off

Supplemental Pay: Yearly bonus

Application Question(s):
Please share with us your desired salary.
Have you implemented reuse of dataflows across multiple reports or workspaces?
Would you be open to presenting a Power BI report you developed professionally — and explaining your approach to data connection, transformation, and solving performance or business logic challenges?

Work Location: In person
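The extract-and-cleanse responsibilities above would normally be implemented in Power Query (M) or Dataflow Gen 2; as a language-neutral illustration of the same idea, here is a small pandas sketch that loads a workbook, cleans key fields, and computes a monthly revenue KPI ready to feed a report. The file, sheet, and column names are invented.

```python
# Hedged sketch: extract, cleanse, and summarize data for a KPI (pandas stand-in for Power Query).
# Workbook, sheet, and column names are hypothetical.
import pandas as pd

orders = pd.read_excel("monthly_orders.xlsx", sheet_name="Orders")  # placeholder workbook

cleaned = (
    orders
    .rename(columns=str.strip)                           # trim stray whitespace in headers
    .dropna(subset=["OrderID", "OrderDate"])             # drop rows missing key fields
    .assign(OrderDate=lambda d: pd.to_datetime(d["OrderDate"], errors="coerce"))
    .query("Amount > 0")                                  # keep only valid sales
)

# Simple KPI: monthly revenue, ready for a Power BI visual or export
kpi = (cleaned
       .groupby(cleaned["OrderDate"].dt.to_period("M"))["Amount"]
       .sum()
       .rename("MonthlyRevenue"))
print(kpi)
```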

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role Description
Hiring Locations: Chennai, Trivandrum, Kochi
Experience Range: 3 to 6 years

The L1 Data Ops Analyst / Data Pipeline Developer is responsible for developing, testing, and maintaining robust data pipelines and monitoring operational dashboards to ensure smooth data flow. This role demands proficiency in data engineering tools, SQL, and cloud platforms, with the ability to work independently and in 24x7 shift environments. The candidate should be capable of analyzing data, troubleshooting issues using SOPs, and collaborating effectively across support levels.

Key Responsibilities

Development & Engineering:
Design, code, test, and implement scalable and efficient data pipelines.
Develop features in accordance with requirements and low-level design.
Write optimized, clean code using Python, PySpark, SQL, and ETL tools.
Conduct unit testing and validate data integrity.
Maintain comprehensive documentation of work.

Monitoring & Support:
Monitor dashboards, pipelines, and databases across assigned shifts.
Identify, escalate, and resolve anomalies using defined SOPs.
Collaborate with L2/L3 teams to ensure timely issue resolution.
Analyze trends and anomalies using SQL and Excel.

Process Adherence & Contribution:
Follow configuration and release management processes.
Participate in estimation, knowledge sharing, and defect management.
Adhere to SLA and compliance standards.
Contribute to internal documentation and knowledge bases.

Mandatory Skills:
Strong command of SQL for data querying and analysis.
Proficiency in Python or PySpark for data manipulation.
Experience with ETL tools (any of the following): Informatica, Talend, Apache Airflow, AWS Glue, Azure ADF, GCP Dataproc/Dataflow.
Experience working with cloud platforms (AWS, Azure, or GCP).
Hands-on experience with data validation and performance tuning.
Working knowledge of data schemas and data modeling.

Good to Have Skills:
Certification in Azure, AWS, or GCP (foundational or associate level).
Familiarity with monitoring tools and dashboard platforms.
Understanding of data warehouse concepts.
Exposure to BigQuery, ADLS, or similar services.

Soft Skills:
Excellent written and verbal communication in English.
Strong attention to detail and analytical skills.
Ability to work in a 24x7 shift model, including night shifts.
Ability to follow SOPs precisely and escalate issues appropriately.
Self-motivated with minimal supervision.
Team player with good interpersonal skills.

Outcomes Expected:
Timely and error-free code delivery.
Consistent adherence to engineering processes and release cycles.
Documented and trackable issue handling with minimal escalations.
Certification and training compliance.
High availability and uptime of monitored pipelines and dashboards.

Skills: SQL, Data Analysis, MS Excel, Dashboards
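As a concrete example of the data-validation work described above, here is a minimal PySpark sketch that runs basic row-count, null, and duplicate checks on a daily partition and raises on failure so the orchestrator can flag the run for escalation per SOP. Paths, column names, and thresholds are hypothetical.

```python
# Hedged sketch: basic PySpark data-quality checks on a daily load.
# Paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-load-checks").getOrCreate()

df = spark.read.parquet("s3://example-bucket/curated/transactions/ds=2024-01-01/")  # placeholder path

checks = df.agg(
    F.count(F.lit(1)).alias("row_count"),
    F.sum(F.col("amount").isNull().cast("int")).alias("null_amounts"),
    F.countDistinct("transaction_id").alias("distinct_ids"),
).collect()[0]

# Raise so the scheduler marks the run as failed; escalation then follows the SOP.
if checks["row_count"] == 0:
    raise ValueError("Validation failed: no rows loaded for partition")
if checks["null_amounts"] > 0:
    raise ValueError(f"Validation failed: {checks['null_amounts']} null amounts")
if checks["distinct_ids"] != checks["row_count"]:
    raise ValueError("Validation failed: duplicate transaction_id values detected")

print("All checks passed:", checks.asDict())
```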

Posted 1 month ago

Apply