7.0 years
4 - 7 Lacs
Thiruvananthapuram
On-site
Trivandrum, India | Technology | Full time | 6/23/2025 | J00167993

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking a Software Engineering Lead to spearhead innovative batch and data product development using Java and Google Cloud Platform. This exciting role offers the opportunity to lead cutting-edge projects, mentor a team of developers, and drive the implementation of high-impact data solutions. As a key player in Equifax's technology team, you'll leverage your expertise to shape the future of data engineering while working with the latest cloud and AI technologies. The position combines technical expertise with leadership skills to drive the development and deployment of advanced batch and data products.

Key Responsibilities:
- Oversee the design, development, and deployment of innovative batch and data products
- Provide technical direction for Java and GCP implementations
- Lead and mentor a team of developers and engineers
- Collaborate with cross-functional teams to translate requirements into technical solutions
- Implement rigorous testing strategies and optimize performance
- Maintain documentation and ensure compliance with standards

What experience you need:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field
- Proficiency in Java and its ecosystems
- Extensive experience with Google Cloud Platform, including GKE, Cloud Storage, Dataflow, BigQuery
- Minimum of 7 years in software development, focusing on batch processing and data solutions
- Exceptional communication and teamwork skills
- Experience with securely handling sensitive data (PII/PHI)
- Proven track record of writing defect-free code
- At least 3 years in a lead role

What could set you apart:
- Ability to convey complex technical concepts to non-technical stakeholders
- Relevant certifications (e.g., Google Cloud Professional Developer, Professional Data Engineer)
- Experience with multiple cloud platforms (e.g., AWS, Azure) in addition to GCP
- Proficiency in both Java and Python for versatile development
- Experience in optimizing performance and reducing costs in cloud environments

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees.

Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Posted 1 month ago
4.0 - 7.0 years
24 Lacs
India
Remote
Vacancy with a company focused on digital transformation, specializing in intelligent automation, digitalization, data science & analytics, and mobile enablement. They help businesses improve cost efficiency, productivity, and agility by reducing turnaround time and errors. The company provides services and solutions including operations digital transformation consulting, next-gen shared services setup consulting, cognitive RPA deployment, and AI-enabled CX enhancement. Founded in 2020 with HQ in Gurugram, India, the company now also operates from Noida, Mumbai, Hyderabad, and Bengaluru.

Job Role: Big Data, GCP
Years of Experience: 4 to 7 years

- The candidate should have extensive production experience (1-2 years) in GCP; other cloud experience would be a strong bonus.
- Strong background in data engineering, with 4-5 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc.
- Exposure to production applications is a must, along with operating knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).

Job Types: Full-time, Permanent
Pay: Up to ₹2,464,248.21 per year
Benefits: Cell phone reimbursement, internet reimbursement, life insurance, paid sick time, paid time off, work from home
Work Location: In person
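For illustration, here is a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client, one of the GCP services this role centers on; the project, dataset, and table names are hypothetical placeholders, not part of the posting.

```python
from google.cloud import bigquery

# Assumes application-default credentials; project id is a placeholder.
client = bigquery.Client(project="example-project")

query = """
    SELECT user_id, COUNT(*) AS events
    FROM `example-project.analytics.events`
    WHERE event_date = @run_date
    GROUP BY user_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("run_date", "DATE", "2025-01-01"),
    ]
)

# Iterating the query job waits for completion and yields result rows.
for row in client.query(query, job_config=job_config):
    print(row.user_id, row.events)
```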
Posted 1 month ago
10.0 - 15.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Experience: 10 to 15 years
Location: Bengaluru, Gurgaon, Pune

About Us: AceNet Consulting is a fast-growing global business and technology consulting firm specializing in business strategy, digital transformation, technology consulting, product development, start-up advisory and fund-raising services to our global clients across banking & financial services, healthcare, supply chain & logistics, consumer retail, manufacturing, eGovernance and other industry sectors. We are looking for hungry, highly skilled and motivated individuals to join our dynamic team. If you're passionate about technology and thrive in a fast-paced environment, we want to hear from you.

Job Summary: The data architect is responsible for designing, creating, and managing an organization's data architecture. This role is critical in establishing a solid foundation for data management within an organization, ensuring that data is organized, structured, accessible, secure, and aligned with business objectives.

Key Responsibilities:
* Interact with and influence business stakeholders to secure strong engagement and ensure that data & analytical product delivery aligns with longer-term strategic roadmaps.
* Design and contribute towards the structure and layout of lakehouse architecture, optimizing data storage and establishing data access controls and security measures.
* Implement the long-term Data & Analytics strategy and deliver functional objectives.
* Assess requirement feasibility; translate high-level business requirements into data requirements, appropriate metadata, test data, and data quality standards.
* Explore data sources by working with application owners to confirm datasets to be extracted.
* Contribute to establishing and implementing database structure, including schema design, table definitions, column specifications, and naming conventions.
* Design data models for source data products, master data products, and insight data products.
* Document data architecture artifacts for different data products and solutions, and perform peer review across various functions.
* Support Data Engineering and BI Engineering teams during the build phase.
* Review data model development; validate and provide deployment approval.
* Work closely with data stewards and governance functions to continuously improve data quality and enhance the reliability of data models.
* Simplify the existing data architecture, delivering reusable services and cost-saving opportunities in line with the policies and standards of the company.
* Lead and participate in the peer review and quality assurance of project architectural artifacts across the Data Architecture group through governance forums.
* Collaborate on and contribute to the development and enhancement of standards, guidelines, and best practices within the Data Architecture discipline.
* Work with product owners, business stewards, data scientists and end users to understand data consumers' needs and develop data products/solutions.
* Evaluate and recommend emerging technologies for data management, storage, and analytics.

Role Requirements and Qualifications:
* A bachelor's degree in computer science, data science, engineering, or a related field.
* At least five years of relevant experience in design and implementation of data models for enterprise data warehouse initiatives.
* Ability to translate business requirements and guide solution design & architecture in developing data products.
* Develop scalable, high-performance, and reusable data models that can be efficiently utilized across different data initiatives and help in generating actionable insights.
* Work collaboratively with data stewardship and governance functions to continuously improve data quality and enhance the reliability of data models.
* Ability to navigate and collaborate with cross-functional teams involving data scientists, business analysts, and stakeholders.
* Strong business process and functional understanding with an analytical background.
* CPG experience with knowledge of domain-specific concepts is a plus.
* Knowledge of Agile methodologies, with experience working on tools such as Jira & Confluence.
* Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP), real-time data distribution (Kafka, Dataflow), and modern data warehouse tools (Snowflake).
* Experience with database technologies such as SQL, NoSQL, Snowflake, HANA.
* Understanding of entity-relationship modeling, metadata systems, and data quality tools and techniques.
* Experience building enterprise data models (logical, physical, conceptual); data modeling tool experience (ERWIN, ER/STUDIO, etc.) is a plus.
* Strong business process and SAP functional understanding with an analytics background (preferably SAP ECC/S4, BW, HANA, BI, ARIBA experience) is a plus.
* Expert-level SQL skills.
* Experience with enterprise-scale data engineering orchestration frameworks/ELT tools and common data engineering Python libraries (dbt, pandas, great expectations, etc.) is a plus.
* Experience with business intelligence tools and technologies such as Power BI & Tableau.
* Strong analytical and problem-solving skills.
* Understanding of data governance principles and practices, including data quality, data security and compliance.
* Ability to think strategically about the use of data within the organization to support both current and future needs.
* Excellent communication and interpersonal skills for stakeholder management and cross-functional collaboration.

Why Join Us:
* Opportunities to work on transformative projects, cutting-edge technology and innovative solutions with leading global firms across industry sectors.
* Continuous investment in employee growth and professional development with a strong focus on up- and re-skilling.
* Competitive compensation & benefits, ESOPs and international assignments.
* Supportive environment with healthy work-life balance and a focus on employee well-being.
* Open culture that values diverse perspectives, encourages transparent communication and rewards contributions.
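As a small illustration of the data-quality work this role touches (pandas and great expectations are listed above as a plus), here is a minimal pandas-based validation sketch; the column names and rules are hypothetical, not the firm's.

```python
import pandas as pd

def validate_customers(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality violations found in the frame."""
    problems = []
    if df["customer_id"].isna().any():
        problems.append("customer_id contains nulls")
    if df["customer_id"].duplicated().any():
        problems.append("customer_id is not unique")
    if not df["signup_date"].between("2000-01-01", pd.Timestamp.today()).all():
        problems.append("signup_date outside expected range")
    return problems

# Toy frame with a deliberate duplicate key to show the check firing.
df = pd.DataFrame({
    "customer_id": [1, 2, 2],
    "signup_date": pd.to_datetime(["2021-05-01", "2022-11-30", "2022-11-30"]),
})
print(validate_customers(df))  # ['customer_id is not unique']
```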
Posted 1 month ago
10.0 years
0 Lacs
Kochi, Kerala, India
On-site
Immediate joiner. Kindly share your CV here: joinus@amussoft.com

Qualifications:
- Expertise in data architecture and data modeling
- 10+ years of experience in data architecture, with at least 3 years in GCP environments
- Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services
- Strong experience in data warehousing, data lakes, and real-time data pipelines
- Proficiency in SQL, Python, or other data processing languages
- Experience with cloud security, data governance, and compliance frameworks
- Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred
- Excellent communication and collaboration skills
Posted 1 month ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Reporting to the A/NZ DSE Chapter Manager India PEC within Decision Sciences & Engineering, this role will own and be responsible for the data & analytic engineering chapter in India PEC. The Data Engineer is an essential part of the business that enables the team to support the ongoing acquisition and internal purposing of data, through to the fulfilment of products, insights and systems. As a Data Engineer, you will be responsible for working with our internal customers to ensure that data and systems are being designed and built to move and manipulate data in a scalable, reusable and efficient manner to suit the environment, project, security and requirements.

What You'll Do
- Design, architect, and implement scalable and secure data pipelines on GCP, utilizing services like Dataflow, Pub/Sub, and Cloud Storage.
- Develop and maintain data models, ensuring data quality, consistency, and accessibility for various internal stakeholders.
- Automate data processes and workflows using scripting languages like Python, leveraging technologies like Spark and Airflow.
- Monitor and troubleshoot data pipelines, identifying and resolving performance issues proactively.
- Stay up-to-date with the latest trends and advancements in GCP and related technologies, actively proposing and evaluating new solutions.
- Implement data governance best practices, including data security, access control, and lineage tracking.
- Lead security initiatives; design and implement security architecture.
- Lead data quality initiatives; design and implement monitoring dashboards.
- Mentor and guide junior data engineers, sharing knowledge and best practices to foster a high-performing team.

The role requires a solid educational foundation and the ability to develop a strategic vision and roadmap for D&A's transition to the cloud while balancing delivery of near-term results that are aligned with execution.

What Experience You Need
- BS degree in a STEM major or equivalent discipline; Master's degree strongly preferred
- 8+ years of experience as a data engineer or in a related role, with experience demonstrating leadership capabilities
- Cloud certification strongly preferred
- Expert-level skills using programming languages such as Python or SQL (BigQuery), and advanced-level experience with scripting languages
- Demonstrated proficiency in all Google Cloud services
- Experience building and maintaining complex data pipelines, troubleshooting complex issues, and transforming and loading data into a pipeline so that the content can be digested and used in future projects; proficiency in Airflow strongly desired
- Experience designing and implementing advanced to complex data models, and experience enabling advanced optimization to improve performance
- Experience leading a team; Git expertise strongly preferred
- Hands-on experience with Agile methodologies
- Working knowledge of CI/CD

What Could Set You Apart
- Self-starter that identifies/responds to priority shifts with minimal supervision
- Strong communication and presentation skills
- Strong leadership qualities
- A well-balanced view of resource management, thinking creatively and effectively to deploy the team whilst building skills for the future
- Skilled in internal networking, negotiating and proactively developing individuals and teams to be the best they can be
- Strong communicator & presenter, bringing everyone on the journey
- Knowledge of Big Data technology and tools, with the ability to share ideas among a collaborative team and drive the team based on technical expertise and learning, sharing best practices
- Excellent communication skills to engage with senior management, internal customers and product management
- Sound understanding of regulations and security requirements governing access to data in Big Data systems
- Sound understanding of insight delivery systems for batch and online
- Able to run Agile Scrum-based projects
- Demonstrated problem-solving skills and the ability to resolve conflicts
- Experience creating and maintaining product and software roadmaps
- Experience overseeing yearly as well as product/project budgets
- Experience working in a highly regulated environment

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
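To make the Airflow/Python orchestration requirement above concrete, here is a minimal Airflow 2.x DAG sketch; the DAG id, schedule, and transform body are hypothetical placeholders, not Equifax code.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def transform(**context):
    # Placeholder for the real work, e.g. kicking off a Dataflow job
    # or loading a partition into BigQuery.
    print("processing partition", context["ds"])

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",  # Airflow 2.x style scheduling
    catchup=False,
) as dag:
    PythonOperator(task_id="transform", python_callable=transform)
```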
Posted 1 month ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: GCP Data Engineer
Location: Chennai (Hybrid Work Mode)
Experience: 5+ years, with 4 relevant years in GCP
Budget: 30+ LPA
Joining: Immediate joiners preferred (July 1st or 2nd week)
Employment Type: Full-Time
Client Type: Product and Service-Based Clients

Job Description: We are looking for an experienced GCP Data Engineer to join our team for an engagement with a leading product and service-based client. The position is based in Chennai with a hybrid work model (3 days WFO). The ideal candidate should be able to join by the first or second week of July.

Key Responsibilities:
- Build, manage, and optimize robust data pipelines using GCP tools such as BigQuery, Dataflow, Composer, and Pub/Sub
- Design scalable and efficient data models to support analytics and reporting
- Develop data engineering solutions using SQL and Python
- Implement workflow orchestration using Airflow or similar tools
- Work closely with DevOps for CI/CD integration, version control, and cloud-native deployment
- Ensure data integrity, security, and performance across distributed systems

Required Skills:
- Hands-on experience with GCP services: BigQuery, Dataflow, Composer, Pub/Sub
- Strong proficiency in SQL, Python, and data modeling
- Experience with Airflow or similar orchestration tools
- Familiarity with CI/CD pipelines, Git, and cloud-native architectures

Nice to Have:
- Exposure to Terraform, Looker, or Kafka
- Experience with Docker and DevOps practices
- Background in MLOps or working with ML pipelines

Educational Qualifications: Must have completed B.E/B.Tech. If the candidate holds a postgraduate degree (PG), it should be in regular (full-time) mode only.

Additional Information:
- Location: Chennai (Hybrid, 3 days WFO)
- Client Type: Product and Service-Based
- Notice Period: Immediate joiners preferred (able to join by July 1st or 2nd week)

If you're passionate about building cloud-based data solutions and meet the above criteria, we invite you to apply and be part of a forward-thinking team. Interested? Drop your resume at gomathi@reveilletechnologies.com
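As a concrete illustration of the Pub/Sub skills listed above, here is a minimal publisher sketch using the google-cloud-pubsub client; the project and topic names are hypothetical.

```python
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()  # assumes application-default credentials
topic_path = publisher.topic_path("example-project", "orders")

# Publish a single JSON event; Pub/Sub payloads are raw bytes.
event = {"order_id": 42, "status": "CREATED"}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print("published message", future.result())  # blocks until the server acks
```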
Posted 1 month ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from Getronics! We have multiple opportunities for Senior GCP Data Engineers for our automotive client in Chennai.

Position Description: The Data Analytics team is seeking a GCP Data Engineer to create, deliver, and support custom data products, as well as enhance and expand team capabilities. They will work on analysing and manipulating large datasets supporting the enterprise by activating data assets to support enabling platforms and analytics. Google Cloud Data Engineers will be responsible for designing the transformation and modernization on Google Cloud Platform. Experience with large-scale solutioning and operationalization of data warehouses, data lakes and analytics platforms on Google Cloud Platform is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions, with the appropriate combination of Google Cloud Platform and 3rd-party technologies, for deployment on Google Cloud Platform.

Skills Required:
- Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
- Implement methods for automation of all parts of the pipeline to minimize labor in development and production. This includes designing and deploying a pipeline with automated data lineage.
- Identify, develop, evaluate and summarize proofs of concept to prove out solutions. Test and compare competing solutions and report a point of view on the best solution.
- Integration between GCP Data Catalog and Informatica EDC.
- Design and build production data engineering solutions to deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.

Experience Required: 6+ years of experience in Data Engineering, of which a minimum of 3+ years on Google Cloud Platform is a must.

Education Required: Any Bachelor's degree (preferably an Engineering graduate)

Additional Information:
- Willing to work in hybrid mode (3 days a week) at the client location in Chennai.
- Looking for immediate to 30 days' notice candidates only.
- Candidates should be willing to attend a GCP coding assessment (1-hour online video coding) as the first level of interview.

Interested candidates, please share your resume to abirami.rsk@getronics.com
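For a flavor of the pipeline patterns named above, here is a minimal Apache Beam (Dataflow) batch sketch that reads CSV from Cloud Storage and writes to BigQuery; the bucket, table, and schema are hypothetical, not the client's.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}

# Runs locally by default; pass --runner=DataflowRunner plus project,
# region, and staging options to execute on Dataflow.
options = PipelineOptions()
with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:analytics.transactions",
            schema="user_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```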
Posted 1 month ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Skills: Python, Spark, Data Engineer, Cloudera, On-premise, Azure, Snowflake, Kafka

Overview of the Company: Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom. It's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries.

Team Overview: The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution!

About the Role
Title: Lead Data Engineer
Location: Mumbai

Responsibilities:
- End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow.
- Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the team's data pipeline framework evolution.
- Data Architecture & Solutions: Contribute to data architecture design, applying data modelling, storage, and retrieval expertise.
- Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices.
- Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights.
- Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth.

Qualification Details:
- Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field.
- Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOP and functional programming concepts.
- Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.).
- Database Expertise: Excellent querying skills (SQL) and strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
- End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks, including streaming real-time data.
- Cloud Expertise: Knowledge of cloud technologies like Azure HDInsight, Synapse, Event Hubs and GCP Dataproc, Dataflow, BigQuery.
- CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation.

Desired Skills & Attributes:
- Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems and troubleshoot data pipeline issues effectively.
- Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders).
- Continuous Learning & Adaptability: A demonstrated passion for staying up-to-date with emerging data technologies and a willingness to adapt to new tools.
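As an illustration of the streaming stack named above (Spark plus Kafka), here is a minimal PySpark Structured Streaming sketch; the broker address and topic are hypothetical, and the job needs the spark-sql-kafka connector package available at runtime.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-events").getOrCreate()

# Read a Kafka topic as an unbounded stream of records.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .load()
)

# Kafka keys/values arrive as bytes; count events per key as a toy aggregate.
counts = (
    events.select(F.col("key").cast("string"))
    .groupBy("key")
    .count()
)

query = (
    counts.writeStream.outputMode("complete")
    .format("console")  # a real job would write to a durable sink instead
    .start()
)
query.awaitTermination()
```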
Posted 1 month ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Skills: Data Engineer, Spark, Scala, Python, On-premise, Cloudera, Snowflake, Kafka

Overview of the Company: Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom. It's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries.

Team Overview: The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution!

About the Role
Title: Senior Data Engineer
Location: Mumbai

Responsibilities:
- End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow.
- Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the team's data pipeline framework evolution.
- Data Architecture & Solutions: Contribute to data architecture design, applying data modelling, storage, and retrieval expertise.
- Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices.
- Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights.
- Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth.

Qualification Details:
- Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field.
- Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOP and functional programming concepts.
- Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.).
- Database Expertise: Excellent querying skills (SQL) and strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
- End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks, including streaming real-time data.
- Cloud Expertise: Knowledge of cloud technologies like Azure HDInsight, Synapse, Event Hubs and GCP Dataproc, Dataflow, BigQuery.
- CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation.

Desired Skills & Attributes:
- Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems and troubleshoot data pipeline issues effectively.
- Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders).
- Continuous Learning & Adaptability: A demonstrated passion for staying up-to-date with emerging data technologies and a willingness to adapt to new tools.
Posted 1 month ago
7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking a Software Engineering Lead to spearhead innovative batch and data product development using Java and Google Cloud Platform. This exciting role offers the opportunity to lead cutting-edge projects, mentor a team of developers, and drive the implementation of high-impact data solutions. As a key player in Equifax's technology team, you'll leverage your expertise to shape the future of data engineering while working with the latest cloud and AI technologies. The position combines technical expertise with leadership skills to drive the development and deployment of advanced batch and data products.

Key Responsibilities
- Oversee the design, development, and deployment of innovative batch and data products
- Provide technical direction for Java and GCP implementations
- Lead and mentor a team of developers and engineers
- Collaborate with cross-functional teams to translate requirements into technical solutions
- Implement rigorous testing strategies and optimize performance
- Maintain documentation and ensure compliance with standards

What Experience You Need
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field
- Proficiency in Java and its ecosystems
- Extensive experience with Google Cloud Platform, including GKE, Cloud Storage, Dataflow, BigQuery
- Minimum of 7 years in software development, focusing on batch processing and data solutions
- Exceptional communication and teamwork skills
- Experience with securely handling sensitive data (PII/PHI)
- Proven track record of writing defect-free code
- At least 3 years in a lead role

What could set you apart
- Ability to convey complex technical concepts to non-technical stakeholders
- Relevant certifications (e.g., Google Cloud Professional Developer, Professional Data Engineer)
- Experience with multiple cloud platforms (e.g., AWS, Azure) in addition to GCP
- Proficiency in both Java and Python for versatile development
- Experience in optimizing performance and reducing costs in cloud environments

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Posted 1 month ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About the Role: We are seeking a talented and experienced GCP Data Engineer with 5–8 years of overall experience in data engineering and strong hands-on expertise with Google Cloud Platform services. The ideal candidate will be responsible for building, optimizing, and maintaining scalable, robust, and secure data pipelines that support advanced analytics and business intelligence initiatives.

Key Responsibilities:
- Design, implement, and maintain scalable data pipelines and ETL/ELT workflows using GCP services like Dataflow, BigQuery, Pub/Sub, and Cloud Functions.
- Build and manage real-time and batch data processing pipelines with an emphasis on reliability, security, and performance.
- Leverage BigQuery for high-performance analytical queries and transformations.
- Implement streaming and event-driven architectures using Pub/Sub and Dataflow.
- Integrate data pipelines with third-party APIs, cloud storage, and internal systems.
- Collaborate with product teams, data analysts, and ML engineers to align data engineering solutions with business goals.
- Ensure data quality, governance, security, and compliance across the platform.
- Participate in code reviews, design sessions, and technical planning.
- Create and maintain clear technical documentation and operational guides.

Required Skills & Experience:
- 5 to 8 years of experience in data engineering, with at least 2+ years of hands-on GCP experience.
- Proficient with key GCP data services: BigQuery, Dataflow (Apache Beam), Pub/Sub, and Cloud Functions.
- Strong coding skills in Python or Java, and advanced SQL skills.
- Experience with REST APIs, data ingestion, streaming, and batch processing.
- Strong understanding of data warehousing, data lake architecture, and data modeling principles.
- Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions) and DevOps practices.
- Exposure to Docker and cloud deployment workflows.
- Experience working with PostgreSQL, MongoDB, or other relational/NoSQL databases.
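To make the event-driven pattern above concrete, here is a minimal sketch of a Python Cloud Function (1st-gen background-function signature) that fires when a file lands in a Cloud Storage bucket; the bucket name and downstream handling are hypothetical.

```python
def on_file_arrival(event: dict, context) -> None:
    """Triggered by a google.storage.object.finalize event on an input bucket."""
    bucket = event["bucket"]
    name = event["name"]
    print(f"New object gs://{bucket}/{name}, size={event.get('size')}")
    # A real pipeline might publish a Pub/Sub message here, or start a
    # Dataflow job to process the newly arrived file.
```

A function like this would be wired to the bucket at deploy time, roughly: `gcloud functions deploy on_file_arrival --runtime=python311 --trigger-resource=example-bucket --trigger-event=google.storage.object.finalize`.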
Posted 1 month ago
7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking a Software Engineering Lead to spearhead innovative batch and data product development using Java and Google Cloud Platform. This exciting role offers the opportunity to lead cutting-edge projects, mentor a team of developers, and drive the implementation of high-impact data solutions. As a key player in Equifax's technology team, you'll leverage your expertise to shape the future of data engineering while working with the latest cloud and AI technologies. The position combines technical expertise with leadership skills to drive the development and deployment of advanced batch and data products.

Key Responsibilities:
- Oversee the design, development, and deployment of innovative batch and data products
- Provide technical direction for Java and GCP implementations
- Lead and mentor a team of developers and engineers
- Collaborate with cross-functional teams to translate requirements into technical solutions
- Implement rigorous testing strategies and optimize performance
- Maintain documentation and ensure compliance with standards

What experience you need:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field
- Proficiency in Java and its ecosystems
- Extensive experience with Google Cloud Platform, including GKE, Cloud Storage, Dataflow, BigQuery
- Minimum of 7 years in software development, focusing on batch processing and data solutions
- Exceptional communication and teamwork skills
- Experience with securely handling sensitive data (PII/PHI)
- Proven track record of writing defect-free code
- At least 3 years in a lead role

What could set you apart
- Ability to convey complex technical concepts to non-technical stakeholders
- Relevant certifications (e.g., Google Cloud Professional Developer, Professional Data Engineer)
- Experience with multiple cloud platforms (e.g., AWS, Azure) in addition to GCP
- Proficiency in both Java and Python for versatile development
- Experience in optimizing performance and reducing costs in cloud environments
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Introduction: In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities:
- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations
- Ability to analyse data for functional business requirements and interface directly with customers

Preferred Education: Master's Degree

Required Technical and Professional Expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work

Preferred Technical and Professional Experience:
- Intuitive individual with an ability to manage change and proven time management
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
- Up-to-date technical knowledge by attending educational workshops and reviewing publications
Posted 1 month ago
5.0 - 10.0 years
20 - 35 Lacs
Pune
Work from Office
Responsibilities:
- Lead and mentor a team of data engineers, providing technical guidance, setting best practices, and overseeing task execution for the migration project.
- Design, develop, and architect scalable ETL processes to extract, transform, and load petabytes of data from on-premises SQL Server to GCP Cloud SQL for PostgreSQL.
- Oversee the comprehensive analysis of existing SQL Server schemas, data types, stored procedures, and complex data models, defining strategies for their optimal conversion and refactoring for PostgreSQL.
- Establish and enforce rigorous data validation, quality, and integrity frameworks throughout the migration lifecycle, ensuring accuracy and consistency.
- Collaborate strategically with database administrators, application architects, business stakeholders, and security teams to define migration scope, requirements, and cutover plans.
- Lead the development and maintenance of advanced scripts (primarily Python) for automating large-scale migration tasks, complex data transformations, and reconciliation processes.
- Proactively identify, troubleshoot, and lead the resolution of complex data discrepancies, performance bottlenecks, and technical challenges during migration.
- Define and maintain comprehensive documentation standards for migration strategies, data mapping, transformation rules, and post-migration validation procedures.
- Ensure data governance, security, and compliance standards are meticulously applied throughout the migration process, including data encryption and access controls within GCP.
- Implement schema conversion or a custom schema-mapping strategy for the SQL Server to PostgreSQL shift.
- Refactor and translate complex stored procedures and T-SQL logic to PostgreSQL-compatible constructs while preserving functional equivalence.
- Develop and execute comprehensive data reconciliation strategies to ensure consistency and parity between legacy and migrated datasets post-cutover.
- Design fallback procedures and lead post-migration verification and support to ensure business continuity.
- Ensure metadata cataloging and data lineage tracking using GCP-native or integrated tools.

Must-Have Skills:
- Expertise in data engineering, specifically for Google Cloud Platform (GCP).
- Deep understanding of relational database architecture, advanced schema design, data modeling, and performance tuning.
- Expert-level SQL proficiency, with extensive hands-on experience in both T-SQL (SQL Server) and PostgreSQL.
- Hands-on experience with data migration processes, including moving datasets from on-premises databases to cloud storage solutions.
- Proficiency in designing, implementing, and optimizing complex ETL/ELT pipelines for high-volume data movement, leveraging tools and custom scripting.
- Strong knowledge of GCP services: Cloud SQL, Dataflow, Pub/Sub, Cloud Storage, Dataproc, Cloud Composer, Cloud Functions, and BigQuery.
- Solid understanding of data governance, security, and compliance practices in the cloud, including the management of sensitive data during migration.
- Strong programming skills in Python or Java for building data pipelines and automating processes.
- Experience with real-time data processing using Pub/Sub, Dataflow, or similar GCP services.
- Experience with CI/CD practices and tools like Jenkins, GitLab, or Cloud Build for automating the data engineering pipeline.
- Knowledge of data modeling and best practices for structuring cloud data storage for optimal query performance and analytics in GCP.
- Familiarity with observability and monitoring tools in GCP (e.g., Stackdriver, Prometheus) for real-time data pipeline visibility and alerting.

Good-to-Have Skills:
- Direct experience with GCP Database Migration Service, Storage Transfer Service, or similar cloud-native migration tools.
- Familiarity with data orchestration using tools like Cloud Composer (based on Apache Airflow) for managing workflows.
- Experience with containerization tools like Docker and Kubernetes for deploying data pipelines in a scalable manner.
- Exposure to DataOps tools and methodologies for managing data workflows.
- Experience with machine learning platforms like AI Platform in GCP to integrate with data pipelines.
- Familiarity with data lake architecture and the integration of BigQuery with Google Cloud Storage or Dataproc.
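As one small, concrete slice of the reconciliation work described above, here is a sketch that compares per-table row counts between the legacy SQL Server and the migrated PostgreSQL instance using pyodbc and psycopg2; connection details and the table list are hypothetical, and a production check would also compare checksums or sampled rows, not just counts.

```python
import pyodbc
import psycopg2

TABLES = ["customers", "orders", "payments"]  # hypothetical trusted list

mssql = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=legacy-host;"
    "DATABASE=sales;UID=reader;PWD=secret;TrustServerCertificate=yes"
)
pg = psycopg2.connect(
    host="10.0.0.5", dbname="sales", user="reader", password="secret"
)

def count(conn, table: str) -> int:
    cur = conn.cursor()
    # Table names come from the trusted list above, never from user input.
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

for table in TABLES:
    src, dst = count(mssql, table), count(pg, table)
    status = "OK" if src == dst else "MISMATCH"
    print(f"{table}: source={src} target={dst} {status}")
```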
Posted 1 month ago
0 years
5 - 9 Lacs
Hyderābād
On-site
Req ID: 327059

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP Python PySpark Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

- Strong hands-on experience in designing and building data pipelines using Google Cloud Platform (GCP) services like BigQuery, Dataflow, and Cloud Composer.
- Proficient in Python for data processing, scripting, and automation in cloud and distributed environments.
- Solid working knowledge of Apache Spark/PySpark, with experience in large-scale data transformation and performance tuning.
- Familiar with CI/CD processes, version control (Git), and workflow orchestration tools such as Airflow or Composer.
- Ability to work independently in fast-paced Agile environments with strong problem-solving and communication skills.
- Exposure to modern data architectures and real-time/streaming data solutions is an added advantage.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
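For illustration, here is a minimal PySpark sketch of the kind of large-scale transformation and routine performance tuning the role mentions; the paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-rollup").getOrCreate()

txns = spark.read.parquet("gs://example-bucket/raw/transactions/")

# Keep only the latest record per transaction id (a common dedup pattern).
latest = Window.partitionBy("txn_id").orderBy(F.col("updated_at").desc())
deduped = (
    txns.withColumn("rn", F.row_number().over(latest))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

daily = deduped.groupBy("account_id", "txn_date").agg(
    F.sum("amount").alias("total_amount"),
    F.count("*").alias("txn_count"),
)

# Partition output by date so downstream reads can prune, and coalesce
# to avoid producing a flood of tiny files.
daily.coalesce(32).write.mode("overwrite").partitionBy("txn_date").parquet(
    "gs://example-bucket/curated/daily_rollup/"
)
```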
Posted 1 month ago
5.0 years
4 - 7 Lacs
Thiruvananthapuram
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What you'll do
- Design, develop, and operate high scale applications across the full engineering stack
- Design, develop, test, deploy, maintain, and improve software
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
- Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
- Participate in a tight-knit, globally distributed engineering team
- Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality
- Manage sole project priorities, deadlines, and deliverables
- Research, create, and develop software applications to extend and improve on Equifax Solutions
- Collaborate on scalability issues involving access to data and information
- Actively participate in Sprint planning, Sprint Retrospectives, and other team activities

What experience you need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
- 5+ years experience with cloud technology: GCP, AWS, or Azure
- 5+ years experience designing and developing cloud-native solutions
- 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
- 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What could set you apart
- Self-starter that identifies/responds to priority shifts with minimal supervision
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and microservices
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Developing with modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI
Posted 1 month ago
5.0 years
5 - 9 Lacs
Thiruvananthapuram
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What you'll do
- Design, develop, and operate high scale RPA/AI applications across the full engineering stack
- Design, develop, test, deploy, maintain, and improve software
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
- Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
- Participate in a tight-knit, globally distributed engineering team
- Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality
- Manage sole project priorities, deadlines, and deliverables
- Research, create, and develop software applications to extend and improve on Equifax Solutions
- Collaborate on scalability issues involving access to data and information
- Actively participate in Sprint planning, Sprint Retrospectives, and other team activities

What experience you need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years experience writing, debugging, and troubleshooting code in mainstream RPA/AI tooling and Python
- 5+ years experience with cloud technology: GCP, AWS, or Azure
- 5+ years experience designing and developing cloud-native solutions
- 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What could set you apart
- Self-starter that identifies/responds to priority shifts with minimal supervision
- Experience designing and developing UiPath RPA solutions using Dataflow/Apache Beam, SQL Server, BigQuery, PubSub, GCS, and others
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI
Posted 1 month ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.
What You'll Do
Design, develop, and operate high-scale applications across the full engineering stack
Design, develop, test, deploy, maintain, and improve software
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
Participate in a tight-knit, globally distributed engineering team
Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality
Manage individual project priorities, deadlines, and deliverables
Research, create, and develop software applications to extend and improve on Equifax Solutions
Collaborate on scalability issues involving access to data and information
Actively participate in Sprint planning, Sprint Retrospectives, and other team activities
What Experience You Need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, and CSS
5+ years of experience with cloud technology: GCP, AWS, or Azure
5+ years of experience designing and developing cloud-native solutions
5+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, and GKE/Kubernetes
5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs
What could set you apart
Self-starter who identifies and responds to priority shifts with minimal supervision
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular, and Bootstrap)
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA, and microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
Developing with modern JDK (v1.7+)
Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Us
Transcloud is a cloud technology services company that helps businesses adopt the cloud to empower them for the future.
Job Description
We are seeking a skilled and experienced Cloud Engineer to join our dynamic team. As a Cloud Engineer, you will play a crucial role in implementing and managing cloud architectures for our clients' software applications. Your strong expertise in Google Cloud Platform (GCP) implementations, programming languages, and cloud ecosystem design will contribute to the success of our cloud-based solutions. We offer a highly competitive salary commensurate with industry standards.
Minimum Qualifications:
- Demonstrated experience implementing cloud architecture for software applications.
- Extensive expertise in GCP implementations.
- Proficiency in at least one programming or scripting language.
- Proficiency with Linux CLI commands and the Google Cloud SDK.
- Ability to design holistic cloud ecosystems with a focus on Google Cloud Platform capabilities and features.
- Familiarity with Cloud Shell and GCP commands such as gcloud and gsutil.
- Hands-on experience with Kubernetes, DevOps practices, and developing and managing CI/CD pipelines.
- Hands-on experience with GCP IaaS services such as GCE, GAE, GKE, VPC, DNS, Interconnect, VPN, CDN, Cloud Storage, Filestore, Firebase, Deployment Manager, and Stackdriver.
- Familiarity with GCP services including Cloud Endpoints, Dataflow, Dataproc, Datalab, Dataprep, Cloud Composer, Pub/Sub, and Cloud Functions.
Responsibilities:
- Troubleshoot issues, actively seeking out problems and providing effective solutions.
- Implement HA and DR solutions.
- Be an active participant in the running of the team, fostering a great place to work.
- Engage with the wider business to identify opportunities for future work for the team.
- Experiment with new technologies to help push the boundaries of what the team is building.
Requirements
- Professional certifications related to cloud platforms, specifically Google Cloud Platform.
- Knowledge of containerization technologies (e.g., Docker, Kubernetes).
- Familiarity with DevOps practices and tools.
- Understanding of basic network and security principles in cloud environments.
- Experience with automation and infrastructure-as-code tools, preferably Terraform.
- Experience with other cloud platforms such as AWS or Azure is good to have.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
Benefits
✔ Health insurance for a worry-free lifestyle.
✔ Flexible work hours for better work-life balance.
✔ Informal dress code to express your individuality.
✔ Enjoy a 5-day work week to pursue your passions outside of work.
✔ Exposure to working directly with clients and growing your career.
Apply here: https://wetranscloud.zohorecruit.in/jobs/Careers/77118000003921192/Senior-Cloud-Engineer-GCP
Posted 1 month ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.
What You'll Do
Design, develop, and operate high-scale applications across the full engineering stack
Design, develop, test, deploy, maintain, and improve software
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
Participate in a tight-knit, globally distributed engineering team
Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality
Manage individual project priorities, deadlines, and deliverables
Research, create, and develop software applications to extend and improve on Equifax Solutions
Collaborate on scalability issues involving access to data and information
Actively participate in Sprint planning, Sprint Retrospectives, and other team activities
What Experience You Need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, and CSS
5+ years of experience with cloud technology: GCP, AWS, or Azure
5+ years of experience designing and developing cloud-native solutions
5+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, and GKE/Kubernetes
5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs
What could set you apart
Self-starter who identifies and responds to priority shifts with minimal supervision
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular, and Bootstrap)
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA, and microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
Developing with modern JDK (v1.7+)
Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!
Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real, and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference, and we are looking for talented team players to join us as we help people live their financial best.
Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Hybrid
Shift: (GMT+05:30) Asia/Kolkata (IST)
What do you need for this opportunity?
Must have skills required: Data Engineering, Big Data Technologies, Hadoop, Spark, Hive, Presto, Airflow, Data Modeling, ETL Development, Data Lake Architecture, Python, Scala, GCP (BigQuery, Dataproc, Dataflow, Cloud Composer), AWS, Azure, Big Data Stack
Wayfair is looking for:
About the job
The Data Engineering team within the SMART org supports the development of large-scale data pipelines for machine learning and analytical solutions related to unstructured and structured data. You'll have the opportunity to gain hands-on experience with all kinds of systems in the data platform ecosystem. Your work will have a direct impact on all applications that our millions of customers interact with every day: search results, homepage content, emails, auto-complete searches, browse pages, and product carousels. You will build and scale data platforms that measure the effectiveness of Wayfair's ad costs and media attribution, informing both day-to-day and major marketing spend decisions.
About the Role:
As a Data Engineer, you will be part of the Data Engineering team. This role is inherently multi-functional: the ideal candidate will work with Data Scientists, Analysts, and application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, a knack for understanding requirements clearly, and the ability to iterate quickly. Successful candidates will have strong engineering and communication skills and a belief that data-driven processes lead to phenomenal products.
What you'll do:
Build and launch data pipelines and data products focused on the SMART org, helping teams push the boundaries of insights, creating new product features using data, and powering machine learning models.
Build cross-functional relationships to understand data needs, build key metrics, and standardize their usage across the organization.
Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure.
What You'll Need:
Bachelor's or Master's degree in Computer Science or a related technical subject area, or an equivalent combination of education and experience
3+ years of relevant work experience in the Data Engineering field with web-scale data sets
Demonstrated strength in data modeling, ETL development, and data lake architecture
Data warehousing experience with big data technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.)
Coding proficiency in at least one modern programming language (Python, Scala, etc.)
Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing, with query performance tuning skills for large data sets
Industry experience as a Big Data Engineer working alongside cross-functional teams such as Software Engineering, Analytics, and Data Science, with a track record of manipulating, processing, and extracting value from large datasets
Strong business acumen
Experience leading large-scale data warehousing and analytics projects, including GCP technologies (BigQuery, Dataproc, GCS, Cloud Composer, Dataflow) or related big data technologies in other cloud platforms such as AWS or Azure
Be a team player and introduce/follow best practices in the data engineering space
Ability to effectively communicate (both written and verbally) technical information and the results of engineering design at all levels of the organization
Good to have:
Understanding of NoSQL databases and Pub/Sub architecture setup.
Familiarity with BI tools like Looker, Tableau, AtScale, Power BI, or similar.
Posted 1 month ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description
About GlobalLogic: With a workforce of 30,000+ employees, GlobalLogic is a leader in digital product engineering. We help brands across the globe design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise, we help our clients imagine what's possible and accelerate their transition into tomorrow's digital businesses. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers around the world, extending our deep expertise to customers in the communications, financial services, automotive, healthcare and life sciences, technology, media and entertainment, manufacturing, and semiconductor industries. GlobalLogic is a Hitachi Group Company. Feel free to watch other videos on the GlobalLogic India YouTube channel to understand what makes GlobalLogic truly exceptional!
Requirements
Leadership & Strategy
Lead and mentor a team of cloud engineers, providing technical guidance and career development support
Define cloud architecture standards and best practices across the organization
Collaborate with senior leadership to develop a cloud strategy and roadmap aligned with business objectives
Drive technical decision-making for complex cloud infrastructure projects
Establish and maintain cloud governance frameworks and operational procedures
Leadership Experience
3+ years in technical leadership roles managing engineering teams
Proven track record of successfully delivering large-scale cloud transformation projects
Experience with budget management and resource planning
Strong presentation and communication skills for executive-level reporting
Certifications (Preferred)
Google Cloud Professional Cloud Architect
Google Cloud Professional Data Engineer
Additional relevant cloud or security certifications
Technical Excellence
10+ years of experience designing and implementing enterprise-scale cloud solutions using GCP services
Architect and oversee the development of sophisticated cloud solutions using Python and advanced GCP services
Lead the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services
Design complex integrations with multiple data sources and systems, ensuring optimal performance and scalability
Implement and enforce security best practices and compliance frameworks across all cloud environments
Troubleshoot and resolve complex technical issues while establishing preventive measures
Job responsibilities
Technical Skills
Programming Languages: Expert-level proficiency in Python, with experience in additional languages (Java, Go, or Scala preferred)
GCP Services: Deep expertise across the comprehensive GCP service portfolio, including Dataflow, Compute Engine, BigQuery, Cloud Functions, Cloud Run, Pub/Sub, GCS, IAM, VPC, and emerging services
Containerization & Orchestration: Advanced knowledge of Docker, Kubernetes (GKE), and container orchestration patterns
Cloud Security: Extensive experience implementing enterprise security frameworks, compliance standards (SOC 2, GDPR, HIPAA), and zero-trust architectures
Infrastructure as Code: Proficiency with Terraform, Cloud Deployment Manager, or similar tools
CI/CD: Experience with advanced deployment pipelines and GitOps practices
Cross-functional Collaboration
Partner with C-level executives, senior architects, and product leadership to translate business requirements into technical solutions
Lead cross-functional project teams and coordinate deliverables across multiple stakeholders
Present technical recommendations and project updates to executive leadership
Establish relationships with GCP technical account managers and solution architects
What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you'll experience an inclusive culture of acceptance and belonging, where you'll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.
Learning and development. We are committed to your continuous learning and development. You'll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.
Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you'll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what's possible and bring new solutions to market. In the process, you'll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.
Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!
High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you're placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.
About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we've been at the forefront of the digital revolution, helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 1 month ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Company:
Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has gross revenue of ₹222.1 billion, a global workforce of 234,054, and is listed on NASDAQ. It operates in over 60 countries and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line, with major delivery centers in India in cities such as Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.
Job Title: Data Analyst
Location: Pan India
Experience: 9-12 yrs
Job Type: Contract to hire
Notice Period: Immediate joiners
Mandatory Skills: CRM Analytics, Salesforce, ETL techniques, TCRM, UX design with advanced skills in binding, SAQL, and JSON
Job Description:
• Develop data recipes/dataflows in the CRM Analytics tool to support business requirements.
• Help establish best practices and standards for dataflow, data recipe, dashboard, and report/lens development and deployment.
• Perform data analysis, produce data samples/prototypes, and produce ad-hoc reports. Obtain and analyze data by accessing multiple sources, including ADL, EDW, and Salesforce.
• Partner and collaborate with IT, platform owners, and BI Analysts to provide input on project timelines, scope, and risks for transformational commission initiatives.
• Develop a clear understanding of the system landscape, data repositories, and their interrelationships.
• Develop reports and processes to continuously monitor data quality and integrity issues.
• Locate and define new process improvement opportunities.
• Perform extensive data unit testing and quality assurance (QA). Troubleshoot issues and work cross-functionally towards a solution.
• Respond to and resolve all assigned feedback (all priorities) within the set thresholds.
• Provide guidance/governance on best practices and approaches to design, implement, and sustain effective data-driven analytics solutions.
To succeed in this role, you'll need the following:
• Minimum of 8+ years in Salesforce and CRM Analytics, having successfully implemented at least one TCRM solution.
• Experience working with sales, service, and marketing data is highly preferred.
• Technical expertise building data models and ETL techniques (data integration using Sync, recipes, and dataflows).
• Expertise building Salesforce CRM Analytics dashboard/reporting capabilities and UX design, with advanced skills in binding, SAQL, and JSON.
• Experience building Einstein Discovery models and deploying/embedding the models into Sales Cloud page layouts.
• Familiarity with Einstein AI components like Next Best Action, Prediction Builder, and Insights is nice to have.
• Experience with CRM Analytics standard/templated apps.
• Experience working in an Agile/Scrum environment.
• Strong verbal and written communication skills; a self-starter personality who can operate with minimal supervision.
• Experience in problem solving, critical thinking, and priority setting.
Certifications:
• CRM Analytics & Einstein Discovery Certified (must)
• Salesforce Admin Certified (preferred)
Posted 1 month ago
10.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
We are looking for candidates with 10+ years of experience in a data architect role.
Responsibilities include:
• Design and implement scalable, secure, and cost-effective data architectures using GCP.
• Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage.
• Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP.
• Ensure data architecture aligns with business goals, governance, and compliance requirements.
• Collaborate with stakeholders to define the data strategy and roadmap.
• Design and deploy BigQuery solutions for optimized performance and cost efficiency.
• Build and maintain ETL/ELT pipelines for large-scale data processing.
• Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration.
• Implement best practices for data security, privacy, and compliance in cloud environments.
• Integrate machine learning workflows with data pipelines and analytics tools.
• Define data governance frameworks and manage data lineage.
• Lead data modeling efforts to ensure consistency, accuracy, and performance across systems.
• Optimize cloud infrastructure for scalability, performance, and reliability.
• Mentor junior team members and ensure adherence to architectural standards.
• Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager).
• Ensure high availability and disaster recovery solutions are built into data systems.
• Conduct technical reviews, audits, and performance tuning for data solutions.
• Design solutions for multi-region and multi-cloud data architectures.
• Stay updated on emerging technologies and trends in data engineering and GCP.
• Drive innovation in data architecture, recommending new tools and services on GCP.
Certifications:
• Google Cloud certification is preferred.
Primary Skills:
• 7+ years of experience in data architecture, with at least 3 years in GCP environments.
• Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services.
• Strong experience with data warehousing, data lakes, and real-time data pipelines.
• Proficiency in SQL, Python, or other data processing languages.
• Experience with cloud security, data governance, and compliance frameworks.
• Strong problem-solving skills and the ability to architect solutions for complex data environments.
• Google Cloud certification (Professional Data Engineer, Professional Cloud Architect) preferred.
• Leadership experience and the ability to mentor technical teams.
• Excellent communication and collaboration skills.
Posted 1 month ago