
607 Dataflow Jobs - Page 6

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote


When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You’ll Be Doing…

We are looking for data engineers who can work with world-class team members to help drive the telecom business to its full potential. We are building data products and assets for the telecom wireless and wireline business, including consumer analytics, telecom network performance, and service assurance analytics. We are working on cutting-edge technologies such as digital twin to build these analytical platforms and provide data support for varied AI/ML implementations. As a data engineer, you will collaborate with business product owners, coaches, industry-renowned data scientists, and system architects to develop strategic data solutions from sources that include batch, file, and data streams.

As a Data Engineer with ETL/ELT expertise for our growing data platform and analytics teams, you will understand and enable the required data sets from different sources, both structured and unstructured, into our data warehouse and data lake, with real-time streaming and/or batch processing, to generate insights and perform analytics for business teams within Verizon. Responsibilities include:

- Understanding the business requirements and converting them to technical design.
- Working on data ingestion, preparation, and transformation.
- Developing data streaming applications (a minimal pipeline sketch appears at the end of this posting).
- Debugging production failures and identifying solutions.
- Working on ETL/ELT development.
- Understanding the DevOps process and contributing to DevOps pipelines.

What We’re Looking For...

You’re curious about new technologies and the game-changing possibilities they create. You like to stay up to date with the latest trends and apply your technical expertise to solving business problems.

You’ll need to have:
- Bachelor’s degree or four or more years of work experience.
- Four or more years of relevant work experience.
- Experience with data warehouse concepts and the data management life cycle.
- Experience in big data technologies: GCP/Hadoop/Spark/Composer/Dataflow/BigQuery.
- Experience in complex SQL.
- Experience working on streaming ETL pipelines.
- Expertise in Java.
- Experience with Memorystore/Redis/Spanner.
- Experience in troubleshooting data issues.
- Experience with data pipeline, workflow management, and governance tools.
- Knowledge of information systems and their applications to data management processes.

Even better if you have one or more of the following:
- Three or more years of relevant experience.
- Any relevant ETL/ELT developer certification.
- GCP Data Engineer certification.
- Accuracy and attention to detail.
- Good problem-solving, analytical, and research capabilities.
- Good verbal and written communication.
- Experience presenting to and influencing stakeholders.
- Experience driving a small team of two or more members for technical delivery.

#AI&D

Where you’ll be working: In this hybrid role, you’ll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability, or any other legally protected characteristics.
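Given the posting’s emphasis on streaming ETL into BigQuery via Dataflow, here is a minimal, hypothetical Apache Beam sketch of that pattern in Python; the topic, table, and schema names are placeholders, not details from the posting (which asks for Java expertise — the Java SDK follows the same shape):

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows

# Hypothetical resource names; replace with real project/topic/table IDs.
TOPIC = "projects/example-project/topics/network-events"
TABLE = "example-project:telecom.network_events"
SCHEMA = "device_id:STRING,latency_ms:FLOAT,event_ts:TIMESTAMP"

def run() -> None:
    # streaming=True makes this an unbounded pipeline (Dataflow or DirectRunner).
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=TOPIC)
            | "ParseJson" >> beam.Map(json.loads)
            | "FixedWindows" >> beam.WindowInto(FixedWindows(60))  # 60-second windows
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                schema=SCHEMA,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```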

Posted 1 week ago

Apply

175.0 years

0 Lacs

Chennai

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

We are seeking a visionary and highly motivated Vice President of Engineering to lead our Big Data Application Services organization. In this pivotal role, you will guide the engineering, solutions, and central replatforming teams, driving the transformation of our data engineering capabilities and the evolution toward our next-gen big data platform. The ideal candidate is a strategic and innovative leader with a strong engineering background, deep expertise in big data technologies, especially on Google Cloud Platform (GCP), and a passion for advancing data observability, replatform acceleration, and cutting-edge data engineering solutions.

How will you make an impact in this role?
- Define and execute a strategic roadmap for Big Data Application Services, aligning with overall business objectives and leading the transition to a next-generation big data platform.
- Lead, mentor, and grow high-performing engineering, solutions, and replatforming teams. Foster a culture of innovation, collaboration, engineering excellence, and continuous learning.
- Drive the architecture, design, and implementation of scalable, reliable, and optimized data engineering capabilities and practices on Google Cloud Platform.
- Champion best practices in software engineering, data engineering, and operations within the big data ecosystem.
- Implement comprehensive data observability strategies and tools to ensure data quality, reliability, and performance across the platform (a small freshness-check sketch follows this posting).
- Oversee the development and deployment of advanced data engineering and transformation capabilities, enabling sophisticated data analysis, business intelligence, and machine learning use cases.
- Partner with stakeholders to define and deliver enterprise-wide, impactful data solutions. Work closely with product management, federated data teams, data science, analytics teams, and other business units to understand data needs and deliver solutions that drive business value.
- Work with peers, Principal Engineers, and Principal Architects to assimilate emerging trends and technologies in the big data, cloud, and analytics landscape. Evaluate and introduce new tools and technologies to enhance platform capabilities and drive innovation.
- Excellent leadership and interpersonal skills, with the ability to influence at all levels across functions, from both technical and non-technical perspectives alike; able to lead business and technology conversations with SVP- and/or EVP-level business leaders.

Qualifications
- 15+ years of large-scale technology delivery and formal management in a complex environment and/or comparable experience, with at least 7 years of experience in big data technology delivery.
- Strong systems integration architecture skills and a high degree of technical expertise, ranging across several technologies, with a proven track record of turning new technologies into business solutions.
- Demonstrated ability to lead, inspire, and manage multi-disciplinary engineering teams in a fast-paced, global environment.
- Deep understanding and hands-on experience designing and implementing secure, scalable, and cost-effective GCP big data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Composer/Airflow, Cloud Storage).
- Strong knowledge of data observability principles, frameworks, and tools. Experience implementing solutions for data monitoring, logging, tracing, lineage, and quality alerting.
- Proven experience leading large-scale data platform modernization and replatforming initiatives. Experienced with replatforming tools and accelerators to streamline migration and reduce risk.
- Expertise in designing and implementing robust and scalable ETL/ELT pipelines and data transformation.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
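Since the role emphasizes data observability (monitoring, lineage, quality alerting), here is a minimal, hypothetical table-freshness check against BigQuery in Python; the project, dataset, column, and threshold are illustrative placeholders, not details from the posting:

```python
from google.cloud import bigquery

# Hypothetical table and freshness threshold.
TABLE = "example-project.analytics.transactions"
MAX_STALENESS_HOURS = 6

def check_freshness() -> None:
    client = bigquery.Client()
    sql = f"""
        SELECT TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(event_ts), HOUR) AS age_h
        FROM `{TABLE}`
    """
    age_h = list(client.query(sql).result())[0].age_h
    # In a real platform this would raise a quality alert, not print.
    if age_h is None or age_h > MAX_STALENESS_HOURS:
        print(f"ALERT: {TABLE} is stale ({age_h} hours old)")
    else:
        print(f"OK: {TABLE} refreshed {age_h} hours ago")

if __name__ == "__main__":
    check_freshness()
```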

Posted 1 week ago

Apply

5.0 - 7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain, and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high-priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration, and continuous integration). You will conduct quality control tests to ensure full compliance with specified standards and end-user requirements. You will execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend, and implement changes to enhance the effectiveness of QA strategies.

What You Will Do
- Independently develop scalable and reliable automated tests and frameworks for testing software solutions.
- Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes, and environments.
- Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model.
- Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations.

What Experience You Need
- Bachelor's degree in a STEM major or equivalent experience
- 5-7 years of software testing experience
- Able to create and review test automation according to specifications
- Ability to write, debug, and troubleshoot code in Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
- Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others with respect to software validation (a brief test sketch follows this posting)
- Created test strategies and plans
- Led complex testing efforts or projects
- Participated in Sprint Planning as the Test Lead
- Collaborated with Product Owners, SREs, and Technical Architects to define testing strategies and plans
- Design and development of microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
- Deploy and release software using Jenkins CI/CD pipelines; understand infrastructure-as-code concepts, Helm charts, and Terraform constructs
- Cloud certification strongly preferred

What Could Set You Apart
An ability to demonstrate successful performance of our Success Profile skills, including:
- Attention to Detail - Define test case candidates for automation that are outside of product specifications (i.e., negative testing); create thorough and accurate documentation of all work, including status updates to summarize project highlights; validate that processes operate properly and conform to standards
- Automation - Automate defined test cases and test suites per project
- Collaboration - Collaborate with Product Owners and the development team to plan and assist with user acceptance testing; collaborate with product owners, development leads, and architects on functional and non-functional test strategies and plans
- Execution - Develop scalable and reliable automated tests; develop performance testing scripts to assure products adhere to the documented SLO/SLI/SLAs; specify the need for test data types for automated testing; create automated tests and test data for projects; develop automated regression suites; integrate automated regression tests into the CI/CD pipeline; work with teams on E2E testing strategies and plans against multiple product integration points
- Quality Control - Perform defect analysis and in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and improve processes; analyze results of functional and non-functional tests and make recommendations for improvements
- Performance / Resilience - Understand application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform; conduct performance and resilience testing to ensure products meet SLAs/SLOs
- Quality Focus - Review test cases for complete functional coverage; review the quality section of the Production Readiness Review for completeness; recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; ensure communications are thorough and accurate for all work documentation, including status and project updates
- Risk Mitigation - Work with Product Owners, QE, and development team leads to track and determine prioritization of defect fixes
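Because the role centers on validating Beam/Dataflow solutions, here is a minimal, hypothetical unit test for a Beam transform in Python using Beam's testing utilities; the `normalize` transform and its clamping rule are invented for illustration:

```python
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to

# Hypothetical transform under test: clamps raw scores into a valid range.
def normalize(record: dict) -> dict:
    return {"id": record["id"], "score": max(0, min(850, record["score"]))}

def test_normalize_clamps_out_of_range_scores():
    with TestPipeline() as p:
        output = (
            p
            | beam.Create([{"id": "a", "score": 900}, {"id": "b", "score": -5}])
            | beam.Map(normalize)
        )
        # The assertion runs when the pipeline executes at context exit.
        assert_that(
            output,
            equal_to([{"id": "a", "score": 850}, {"id": "b", "score": 0}]),
        )
```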

Posted 1 week ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


JD for Google Pre-Sales Solution Architect (Data & AI)
- Lead the technical discovery process, assess customer requirements, and design scalable solutions leveraging a comprehensive suite of Data & AI services, including BigQuery, Dataflow, Vertex AI, generative AI solutions, and advanced AI/ML services such as Vertex AI, Gemini, and Agent Builder.
- Architect and demonstrate solutions leveraging generative AI, large language models (LLMs), AI agents, and agentic AI patterns to automate workflows, enhance decision-making, and create intelligent applications.
- Develop and deliver compelling product demonstrations, proofs of concept (POCs), and technical workshops that showcase the value and capabilities of Google Cloud (a minimal demo query follows this posting).
- Strong understanding of data warehousing, data lakes, streaming analytics, and machine learning pipelines.
- Collaborate with sales to build strong client relationships, articulate the business value of Google Cloud solutions, and drive adoption.
- Lead and contribute technical content and architectural designs for RFI/RFP responses and technical proposals leveraging Google Cloud services.
- Stay informed of industry trends, competitive offerings, and new Google Cloud product releases, particularly in the infrastructure and data/AI domains.
- Extensive experience architecting and designing solutions on Google Cloud Platform, with a strong focus on Data & AI services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI (MLOps, custom models, pre-trained APIs), and generative AI (e.g., Gemini).
- Strong understanding of cloud architecture patterns, DevOps practices, and modern software development methodologies.
- Ability to work effectively in a cross-functional team environment with sales, product, and engineering teams.
- 5+ years of experience in pre-sales or solutions architecture, focused on cloud Data & AI platforms.
- Skilled in client engagements, technical presentations, and proposal development.
- Excellent written and verbal communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
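For the product-demonstration side of the role, a minimal BigQuery POC in Python might look like the sketch below. The public dataset referenced is a real Google-hosted sample; its use as a demo here is illustrative, and credentials are assumed to be configured via standard Google Cloud authentication:

```python
from google.cloud import bigquery

def run_demo_query() -> None:
    # Assumes gcloud auth / GOOGLE_APPLICATION_CREDENTIALS is already set up.
    client = bigquery.Client()
    sql = """
        SELECT name, SUM(number) AS total
        FROM `bigquery-public-data.usa_names.usa_1910_2013`
        GROUP BY name
        ORDER BY total DESC
        LIMIT 5
    """
    for row in client.query(sql).result():
        print(f"{row.name}: {row.total}")

if __name__ == "__main__":
    run_demo_query()
```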

Posted 1 week ago

Apply

7.0 years

0 Lacs

Andhra Pradesh, India

On-site


About The Role
We are seeking experienced Data Analysts to join our growing team. The ideal candidate will have a strong background in data analysis, complex SQL queries, and experience working within large-scale data warehouse environments. Familiarity with cloud technologies such as GCP or AWS is mandatory, and prior exposure to AWS EMR and Apache Airflow is highly desirable.

Key Responsibilities
- Perform deep data analysis to support business decision-making, reporting, and strategic initiatives.
- Write and optimize complex SQL queries for data extraction, transformation, and reporting across large, distributed datasets.
- Work extensively within data warehouse environments to design, test, and deliver data solutions.
- Collaborate with data engineers, business analysts, and stakeholders to understand requirements and translate them into technical deliverables.
- Analyze large, complex datasets to identify trends, patterns, and opportunities for business growth.
- Develop, maintain, and optimize ETL/ELT pipelines; familiarity with Apache Airflow for workflow orchestration is a plus (a minimal orchestration sketch appears after this posting).
- Work with cloud-native tools on GCP or AWS to manage and analyze data effectively.
- Support the development of data quality standards and ensure data integrity across all reporting platforms.
- Document data models, queries, processes, and workflows for knowledge sharing and scalability.

Required Skills & Experience
- Minimum 7 years of professional experience in data analysis.
- Strong, demonstrable expertise in SQL, including writing, debugging, and optimizing complex queries.
- Solid experience working within a data warehouse environment (e.g., BigQuery, Redshift, Snowflake).
- Hands-on experience with GCP (BigQuery, Dataflow) or AWS (Redshift, Athena, S3, EMR).
- Knowledge of data modeling concepts, best practices, and data architecture principles.
- Understanding of ETL processes and tools; hands-on experience with Apache Airflow is a strong plus.
- Strong analytical thinking, attention to detail, and problem-solving skills.
- Ability to work in a fast-paced environment and manage multiple priorities.
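As a minimal illustration of the Airflow orchestration mentioned above, here is a hypothetical daily-reporting DAG in Python; the DAG ID, schedule, and task body are placeholders rather than anything specified by the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load() -> None:
    # Placeholder task body; a real task would query the warehouse
    # (e.g., BigQuery or Redshift) and write results downstream.
    print("running daily reporting extract")

# Hypothetical workflow; `schedule` is the Airflow 2.4+ spelling
# (older versions use schedule_interval).
with DAG(
    dag_id="daily_reporting",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```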

Posted 1 week ago

Apply

12.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Role Description

Role Proficiency: Leverage expertise in a technology area (e.g., Informatica transformations, Teradata data warehouse, Hadoop, analytics). Responsible for architecture for small/mid-size projects.

Outcomes
- Implement data extraction and transformation for a data warehouse (ETL, data extracts, data load logic, mapping, workflows, stored procedures), a data analysis solution, data reporting solutions, or cloud data tools on any one of the cloud providers (AWS/Azure/GCP).
- Understand business workflows and related data flows. Develop designs for data acquisition and data transformation or data modelling; apply business intelligence on data or design data fetching and dashboards.
- Design information structure and work- and dataflow navigation. Define backup, recovery, and security specifications.
- Enforce and maintain naming standards and a data dictionary for data models.
- Provide estimates, or guide the team in performing them.
- Help the team develop proofs of concept (POCs) and solutions relevant to customer problems; able to troubleshoot problems while developing POCs.
- Architect/Big Data specialty certification in AWS/Azure/GCP (or a general one, for example via Coursera or a similar learning platform, or any ML certification).

Measures of Outcomes
- Percentage of billable time spent in a year developing and implementing data transformation or data storage.
- Number of best practices documented for any new tool or technology emerging in the market.
- Number of associates trained on the data service practice.

Outputs Expected

Strategy & Planning: Create or contribute to short-term tactical solutions that achieve long-term objectives and an overall data management roadmap. Implement methods and procedures for tracking data quality, completeness, redundancy, and improvement. Ensure that data strategies and architectures meet regulatory compliance requirements. Begin engaging external stakeholders, including standards organizations, regulatory bodies, operators, and scientific research communities, or attend conferences with respect to data in the cloud.

Operational Management: Help architects establish governance, stewardship, and frameworks for managing data across the organization. Provide support in implementing the appropriate tools, software, applications, and systems to support data technology goals. Collaborate with project managers and business teams on all projects involving enterprise data. Analyse data-related issues with systems integration, compatibility, and multi-platform integration.

Project Control and Review: Provide advice to teams facing complex technical issues in the course of project delivery. Define and measure project- and program-specific architectural and technology quality metrics.

Knowledge Management & Capability Development: Publish and maintain a repository of solutions, best practices, standards, and other knowledge articles for data management. Conduct and facilitate knowledge sharing and learning sessions across the team. Gain industry-standard certifications on technology or area of expertise. Support technical skill building (including hiring and training) for the team based on inputs from the project manager/RTEs. Mentor new team members in technical areas. Gain and cultivate domain expertise to provide the best and most optimized solutions to customers (delivery).

Requirement Gathering and Analysis: Work with customer business owners and other teams to collect, analyze, and understand the requirements, including NFRs, and define NFRs. Analyze gaps and trade-offs based on the current system context and industry practices; clarify the requirements by working with the customer. Define the systems and sub-systems that make up the programs.

People Management: Set goals and manage performance of team engineers. Provide career guidance to technical specialists and mentor them.

Alliance Management: Identify alliance partners based on an understanding of service offerings and client requirements. In collaboration with the architect, create a compelling business case around the offerings. Conduct beta testing of the offerings and assess relevance to the program.

Technology Consulting: In collaboration with Architects II and III, analyze the application and technology landscape, processes, and tools to arrive at the architecture options best suited to the client program. Analyze cost vs. benefits of solution options. Support Architects II and III in creating a technology/architecture roadmap for the client. Define the architecture strategy for the program.

Innovation and Thought Leadership: Participate in internal and external forums (seminars, paper presentations, etc.). Understand the client's existing business at the program level and explore new avenues to save cost and bring process efficiency. Identify business opportunities to create reusable components/accelerators, and reuse existing components and best practices.

Project Management Support: Assist the PM/Scrum Master/Program Manager to identify technical risks and come up with mitigation strategies.

Stakeholder Management: Monitor the concerns of internal stakeholders (such as Product Managers and RTEs) and external stakeholders (such as client architects) on architecture aspects. Follow through on commitments to achieve timely resolution of issues. Conduct initiatives to meet client expectations. Work to expand your professional network in the client organization at team and program levels.

New Service Design: Identify potential opportunities for new service offerings based on customer voice and partner inputs. Conduct beta testing/POCs as applicable. Develop collateral and guides for GTM.

Skill Examples
- Use data services knowledge to create POCs that meet business requirements; contextualize the solution to the industry under guidance of architects.
- Use technology knowledge to create POCs and (reusable) assets under the guidance of the specialist. Apply best practices in your own area of work, helping with performance troubleshooting and other complex troubleshooting.
- Define, decide, and defend the technology choices made; review solutions under guidance.
- Use knowledge of technology trends to provide inputs on potential areas of opportunity for UST.
- Use independent knowledge of design patterns, tools, and principles to create high-level designs for the given requirements. Evaluate multiple design options and choose the appropriate options for the best possible trade-offs. Conduct knowledge sessions to enhance the team's design capabilities. Review the low- and high-level designs created by specialists for efficiency (consumption of hardware, memory, memory leaks, etc.).
- Use knowledge of software development processes, tools, and techniques to identify and assess incremental improvements to the development process, methodology, and tooling. Take technical responsibility for all stages of the software development process. Produce optimal code with a clear understanding of memory leakage and its impact.
- Implement global standards and guidelines relevant to programming and development; come up with points of view and new technological ideas.
- Use knowledge of project management and agile tools and techniques to support, plan, and manage medium-size projects/programs as defined within UST, identifying risks and mitigation strategies.
- Use knowledge of project metrics to understand their relevance to the project. Collect and collate project metrics and share them with the relevant stakeholders.
- Use knowledge of estimation and resource planning to create estimates and plan resources for specific modules or small projects with detailed requirements or user stories in place.
- Strong proficiency in understanding data workflows and dataflow.
- Attention to detail.
- High analytical capability.

Knowledge Examples
- Data visualization
- Data migration
- RDBMSs (relational database management systems), SQL
- Hadoop technologies such as MapReduce, Hive, and Pig
- Programming languages, especially Python and Java
- Operating systems such as UNIX and MS Windows
- Backup/archival software

Additional Comments

Snowflake Architect

Key Responsibilities:
- Solution Design: Design the overall data architecture within Snowflake, including database/schema structures, data flow patterns (ELT/ETL strategies involving Snowflake), and integration points with other systems (source systems, BI tools, data science platforms).
- Data Modeling: Design efficient and scalable physical data models within Snowflake. Define table structures, distribution/clustering keys, data types, and constraints to optimize storage and query performance.
- Security Architecture: Design the overall security framework, including the RBAC strategy, data masking policies, encryption standards, and how Snowflake security integrates with broader enterprise security policies.
- Performance and Scalability Strategy: Design solutions with performance and scalability in mind. Define warehouse sizing strategies, query optimization patterns, and best practices for development teams. Ensure the architecture can handle future growth in data volume and user concurrency.
- Cost Optimization Strategy: Design architectures that are inherently cost-effective. Make strategic choices about data storage, warehouse usage patterns, and feature utilization (e.g., when to use materialized views, streams, tasks; a brief Streams/Tasks sketch follows this posting).
- Technology Evaluation and Selection: Evaluate and recommend specific Snowflake features (e.g., Snowpark, Streams, Tasks, External Functions, Snowpipe) and third-party tools (ETL/ELT, BI, governance) that best fit the requirements.
- Standards and Governance: Define best practices, naming conventions, development guidelines, and governance policies for using Snowflake effectively and consistently across the organization.
- Roadmap and Strategy: Align the Snowflake data architecture with overall business intelligence and data strategy goals. Plan for future enhancements and platform evolution.
- Technical Leadership: Provide guidance and mentorship to developers, data engineers, and administrators working with Snowflake.

Key Skills:
- Deep understanding of Snowflake's advanced features and architecture.
- Strong data warehousing concepts and data modeling expertise.
- Solution architecture and system design skills.
- Experience with cloud platforms (AWS, Azure, GCP) and how Snowflake integrates with them.
- Expertise in performance tuning principles and techniques at an architectural level.
- Strong understanding of data security principles and implementation patterns.
- Knowledge of various data integration patterns (ETL, ELT, streaming).
- Excellent communication and presentation skills to articulate designs to technical and non-technical audiences.
- Strategic thinking and planning abilities.

Looking for 12+ years of experience to join our team.

Skills: Snowflake, Data Modeling, Cloud Platforms, Solution Architecture
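As one concrete illustration of the Streams and Tasks features this architect role weighs, here is a minimal, hypothetical sketch using the snowflake-connector-python client; the connection parameters, table names, and incremental-load pattern are assumptions for illustration, not details from the posting:

```python
import snowflake.connector

# Hypothetical connection parameters; supply real credentials in practice.
conn = snowflake.connector.connect(
    account="example_account", user="example_user", password="...",
    warehouse="XS_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# A stream records row-level changes on a source table (assumed to exist)...
cur.execute("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw_orders")

# ...and a task periodically folds those changes into a reporting table.
cur.execute("""
    CREATE TASK IF NOT EXISTS load_orders
      WAREHOUSE = XS_WH
      SCHEDULE = '60 MINUTE'
    AS
      INSERT INTO curated_orders
      SELECT order_id, amount FROM orders_stream
""")
cur.execute("ALTER TASK load_orders RESUME")  # tasks are created suspended
```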

Posted 1 week ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

Remote


About the Job:

At Techjays, we are driving the future of artificial intelligence with a bold mission to empower businesses worldwide by helping them build AI solutions that transform industries. As an established leader in the AI space, we combine deep expertise with a collaborative, agile approach to deliver impactful technology that drives meaningful change. Our global team consists of professionals who have honed their skills at leading companies such as Google, Akamai, NetApp, ADP, Cognizant Consulting, and Capgemini. With engineering teams across the globe, we deliver tailored AI software and services to clients ranging from startups to large-scale enterprises.

Be part of a company that’s pushing the boundaries of digital transformation. At Techjays, you’ll work on exciting projects that redefine industries, innovate with the latest technologies, and contribute to solutions that make a real-world impact. Join us on our journey to shape the future with AI.

We’re looking for a skilled AI Implementation Engineer with a strong JavaScript/TypeScript background to help us build scalable AI-powered systems, with a particular focus on Retrieval-Augmented Generation (RAG) and LLM integrations. You will play a key role in developing intelligent applications that combine vector search, natural language processing, and LLM-driven reasoning, delivering real-time AI experiences to end users. You’ll work with full-stack engineers, AI researchers, and data teams to create seamless interfaces between front-end applications, back-end services, and AI models.

We are also seeking a Data Analytics Engineer to design, develop, and optimize data pipelines and analytical dashboards that drive key business decisions. The ideal candidate will have hands-on experience working with BI tools like Power BI and Tableau and a strong background in building scalable data pipelines in AWS, GCP, or Azure cloud environments.

Experience Range: 3 to 6 years

Primary Skills: Power BI, Tableau, SQL, Data Modeling, Data Warehousing, ETL/ELT Pipelines, AWS Glue, AWS Redshift, GCP BigQuery, Azure Data Factory, Cloud Data Pipelines, DAX, Data Visualization, Dashboard Development

Secondary Skills: Python, dbt, Apache Airflow, Git, CI/CD, DevOps for Data, Snowflake, Azure Synapse, Data Governance, Data Lineage, Apache Beam, Data Catalogs, Basic Machine Learning Concepts

Work Location: Remote

Key Responsibilities:
- Develop and maintain scalable, robust ETL/ELT data pipelines across structured and semi-structured data sources.
- Collaborate with data scientists, analysts, and business stakeholders to identify data requirements and transform them into efficient data models.
- Design and deliver interactive dashboards and reports using Power BI and Tableau.
- Implement data quality checks, lineage tracking, and monitoring solutions to ensure high reliability of data pipelines (a small validation sketch follows this posting).
- Optimize SQL queries and BI reports for performance and scalability.
- Work with cloud-native tools in AWS (e.g., Glue, Redshift, S3), GCP (e.g., BigQuery, Dataflow), or Azure (e.g., Data Factory, Synapse).
- Automate data integration and visualization workflows.

Required Qualifications:
- Bachelor's or Master’s degree in Computer Science, Information Systems, Data Science, or a related field.
- 3+ years of experience in data engineering or data analytics roles.
- Proven experience with Power BI and Tableau, including dashboard design, DAX, calculated fields, and data blending.
- Proficiency in SQL and experience in data modeling and relational database design.
- Hands-on experience with data pipelines and orchestration using tools like Airflow, dbt, Apache Beam, or native cloud tools.
- Experience working with one or more cloud platforms: AWS, GCP, or Azure.
- Strong understanding of data warehousing concepts and tools such as Snowflake, BigQuery, Redshift, or Synapse.

Preferred Skills:
- Experience with scripting in Python or Java for data processing.
- Familiarity with Git, CI/CD, and DevOps for data pipelines.
- Exposure to data governance, lineage, and catalog tools.
- Basic understanding of ML pipelines or advanced analytics is a plus.

What We Offer:
- Competitive salary and benefits.
- Opportunity to work with a modern cloud-native data stack.
- Collaborative, innovative, and data-driven work environment.
- Flexible working hours and remote work options.
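To make the data-quality-check responsibility above concrete, here is a minimal, hypothetical validation step in Python with pandas; the column names and rules are illustrative, not from the posting:

```python
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality failures for a hypothetical sales extract."""
    failures = []
    if df["order_id"].isna().any():
        failures.append("order_id contains nulls")
    if df["order_id"].duplicated().any():
        failures.append("order_id contains duplicates")
    if (df["amount"] < 0).any():
        failures.append("amount contains negative values")
    return failures

# Tiny inline example standing in for a real pipeline extract.
df = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, 7.5]})
for failure in validate(df):
    print("FAILED:", failure)
```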

Posted 1 week ago

Apply

5.0 - 7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain, and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high-priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration, and continuous integration). You will conduct quality control tests to ensure full compliance with specified standards and end-user requirements. You will execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend, and implement changes to enhance the effectiveness of QA strategies.

What You Will Do
- Independently develop scalable and reliable automated tests and frameworks for testing software solutions.
- Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes, and environments.
- Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model.
- Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations.

What Experience You Need
- Bachelor's degree in a STEM major or equivalent experience
- 5-7 years of software testing experience
- Able to create and review test automation according to specifications
- Ability to write, debug, and troubleshoot code in Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
- Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others with respect to software validation
- Created test strategies and plans
- Led complex testing efforts or projects
- Participated in Sprint Planning as the Test Lead
- Collaborated with Product Owners, SREs, and Technical Architects to define testing strategies and plans
- Design and development of microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
- Deploy and release software using Jenkins CI/CD pipelines; understand infrastructure-as-code concepts, Helm charts, and Terraform constructs
- Cloud certification strongly preferred

What Could Set You Apart
An ability to demonstrate successful performance of our Success Profile skills, including:
- Attention to Detail - Define test case candidates for automation that are outside of product specifications (i.e., negative testing); create thorough and accurate documentation of all work, including status updates to summarize project highlights; validate that processes operate properly and conform to standards
- Automation - Automate defined test cases and test suites per project
- Collaboration - Collaborate with Product Owners and the development team to plan and assist with user acceptance testing; collaborate with product owners, development leads, and architects on functional and non-functional test strategies and plans
- Execution - Develop scalable and reliable automated tests; develop performance testing scripts to assure products adhere to the documented SLO/SLI/SLAs; specify the need for test data types for automated testing; create automated tests and test data for projects; develop automated regression suites; integrate automated regression tests into the CI/CD pipeline; work with teams on E2E testing strategies and plans against multiple product integration points
- Quality Control - Perform defect analysis and in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and improve processes; analyze results of functional and non-functional tests and make recommendations for improvements
- Performance / Resilience - Understand application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform; conduct performance and resilience testing to ensure products meet SLAs/SLOs
- Quality Focus - Review test cases for complete functional coverage; review the quality section of the Production Readiness Review for completeness; recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; ensure communications are thorough and accurate for all work documentation, including status and project updates
- Risk Mitigation - Work with Product Owners, QE, and development team leads to track and determine prioritization of defect fixes

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?

At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real, and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference, and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


🚀 We’re Hiring: Data Engineer – GCP | Chennai (3–5 Yrs Exp)

Are you a passionate Data Engineer with hands-on experience in GCP, Java, and SQL? We're looking for skilled professionals who can build scalable data solutions, implement Infra as Code, and collaborate with business teams to drive data-driven decisions.

📍 Location: Chennai (candidates already working from Chennai also welcome)
🧑‍💻 Experience: 3–5 years
📅 Notice Period: Immediate / Serving Notice / Max 30 Days

🔧 What You’ll Do:
- Develop code artifacts and manage them with Git
- Design and implement data ingestion pipelines: streaming, relational, and file-based (a file-based ingestion sketch follows this post)
- Work with Python, Java, and GCP services (including Astronomer)
- Model data for performance and scalability
- Follow DevOps and CI/CD best practices (testing, automation, Infra as Code)
- Collaborate with stakeholders to define data models and trace them back to source systems

✅ Skills Required: GCP (BigQuery, Dataflow, etc.), Java, Python, SQL, Power BI

💡 Nice to Have: API integration, big data tools/pipelines

🎓 Education: Bachelor's degree (or equivalent)

📬 Interested or know someone? Share this post or DM. Let's build something impactful together!

#Hiring #ChennaiJobs #GCP #DataEngineer #ImmediateJoiners #Java #Python #InfraAsCode #DataIngestion #PowerBI
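A minimal sketch of the file-based ingestion path mentioned above, loading a CSV from Cloud Storage into BigQuery in Python; the bucket, file, and table names are hypothetical placeholders:

```python
from google.cloud import bigquery

def load_csv_to_bigquery() -> None:
    client = bigquery.Client()
    # Hypothetical source file and destination table.
    uri = "gs://example-bucket/exports/orders.csv"
    table_id = "example-project.sales.orders"
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,       # skip the header row
        autodetect=True,           # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    job.result()  # block until the load job finishes
    print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")

if __name__ == "__main__":
    load_csv_to_bigquery()
```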

Posted 1 week ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About AutoZone:

AutoZone is the nation's leading retailer and a leading distributor of automotive replacement parts and accessories, with more than 6,000 stores in the US, Puerto Rico, Mexico, and Brazil. Each store carries an extensive line for cars, sport utility vehicles, vans, and light trucks, including new and remanufactured hard parts, maintenance items, and accessories. We also sell automotive diagnostic and repair software through ALLDATA, diagnostic and repair information through ALLDATAdiy.com, automotive accessories through AutoAnything.com, and auto and light truck parts and accessories through AutoZone.com. Since opening its first store in Forrest City, Ark., on July 4, 1979, the company has joined the New York Stock Exchange (NYSE: AZO) and earned a spot in the Fortune 500.

AutoZone has been committed to providing the best parts, prices, and customer service in the automotive aftermarket industry. We have a rich culture and history of going the Extra Mile for our customers and our community. At AutoZone you’re not just doing a job; you’re playing a crucial role in creating a better experience for our customers, while creating opportunities to DRIVE YOUR CAREER almost anywhere! We are looking for talented people who are customer focused, enjoy helping others, and have the DRIVE to excel in a fast-paced environment!

Position Summary

The Systems Engineer will design data model solutions and ensure alignment between business and IT strategies, operating models, guiding principles, and software development, with a focus on the information layer. The Systems Engineer works across business lines and IT domains to ensure that information is viewed as a corporate asset. This includes its proper data definition, creation, usage, archival, and governance. The Systems Engineer works with other engineers and data architects to design overall solutions in accordance with industry best practices, principles, and standards, and strives to improve the quality of systems, provide more flexible solutions, and reduce time-to-market.

Key Responsibilities
- Enhance and maintain the AutoZone information strategy.
- Ensure alignment of programs and projects with the strategic AZ Information Roadmap and related strategies.
- Perform gap analysis between current data structures and target data structures.
- Enhance and maintain the Enterprise Information Model.
- Work with service architects and application architects to assist with the creation of proper data access and utilization methods.
- Gather complex business requirements and translate product and project needs into data models supporting long-term solutions.
- Serve as a technical data strategy expert and lead the creation of technical requirements and design deliverables.
- Define and communicate data standards, industry best practices, technologies, and architectures.
- Check conformance to standards and resolve any conflicts by explaining and justifying architectural decisions.
- Recommend and evaluate new tools and methodologies as needed.
- Manage, communicate, and improve the data governance framework.

Requirements:
- A systems thinker, able to move fluidly between high-level abstract thinking and detail-oriented implementation; open minded to new ideas, approaches, and technologies.
- A data- and fact-driven decision maker, with an ability to make quick decisions under uncertainty when necessary; able to quickly learn new technologies, tools, and organizational structures/strategies.
- Understanding of current industry-standard best practices regarding integration, architecture, tools, and processes.
- A self-starter who is naturally inquisitive, requiring only small pieces of the puzzle, across many technologies, new and legacy.
- Excellent written and verbal communication, presentation, and analytical skills, including the ability to effectively communicate complex technical concepts and designs to a broad range of people.

Education and/or Experience
- Bachelor's degree in MIS, Computer Science, or a similar field, or equivalent experience required.
- Minimum 3+ years of experience with and knowledge of database systems such as Oracle, Postgres, UDB/DB2, BigQuery, Spanner, JSON, and Couchbase.
- Minimum 2 years of experience with data requirements gathering, acquisition of data from different business systems, ingestion of data in GCP using managed services (namely BigQuery, Dataflow, Composer, Pub/Sub, and other ingestion technologies), curation of the data using dbt or similar technologies, and creation of data marts/wide tables for analysis and reporting consumption (a brief Pub/Sub sketch follows this posting).
- Assembling large, complex sets of data that meet non-functional and functional business requirements.
- Identifying, designing, and implementing internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Building the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using GCP and SQL technologies.
- Building analytical tools that utilize the data pipeline, providing actionable insight into key business performance metrics, including operational efficiency and customer acquisition.
- Working with stakeholders, including the Executive, Product, Data, and Design teams, to support their data infrastructure needs and assist with data-related technical issues.
- Relational and NoSQL database design capability across OLTP and OLAP.
- Excellent analytical and problem-solving skills.
- Excellent verbal and written communication skills.
- Ability to facilitate modeling sessions and communicate appropriately with IT and business customers.
- Experience with Agile software development methodologies.
- Experience with large, replicated databases across distributed and cloud data centers.

Our Values: An AutoZoner Always...
- PUTS CUSTOMERS FIRST
- CARES ABOUT PEOPLE
- STRIVES FOR EXCEPTIONAL PERFORMANCE
- ENERGIZES OTHERS
- EMBRACES DIVERSITY
- HELPS TEAMS SUCCEED
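As a minimal illustration of the Pub/Sub ingestion step named above, here is a hypothetical streaming-pull subscriber in Python; the project and subscription names are placeholders, and a real pipeline would stage each record for BigQuery rather than print it:

```python
from concurrent import futures

from google.cloud import pubsub_v1

# Hypothetical subscription path.
SUBSCRIPTION = "projects/example-project/subscriptions/store-events-sub"

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # A real pipeline would parse this record and stage it for BigQuery.
    print("received:", message.data.decode("utf-8"))
    message.ack()

def main() -> None:
    subscriber = pubsub_v1.SubscriberClient()
    streaming_pull = subscriber.subscribe(SUBSCRIPTION, callback=callback)
    print("listening for store events...")
    try:
        streaming_pull.result(timeout=30)  # run briefly for this sketch
    except futures.TimeoutError:
        streaming_pull.cancel()
        streaming_pull.result()  # block until shutdown completes

if __name__ == "__main__":
    main()
```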

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Overview:

We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.

Key Responsibilities:
1. Design, develop, test, and maintain scalable ETL data pipelines using Python (a minimal ETL sketch follows this posting).
2. Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
3. Implement and enforce data quality checks, validation rules, and monitoring.
4. Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
5. Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
6. Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
7. Document pipeline designs, data flow diagrams, and operational support procedures.
8. Work extensively on Google Cloud Platform (GCP) services such as:
   - Dataflow for real-time and batch data processing
   - Cloud Functions for lightweight serverless compute
   - BigQuery for data warehousing and analytics
   - Cloud Composer for orchestration of data workflows (Apache Airflow)
   - Google Cloud Storage (GCS) for managing data at scale
   - IAM for access control and security
   - Cloud Run for containerized applications

Required Skills:
- 4–6 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).

Good to Have (Optional Skills):
- Experience working with the Snowflake cloud data platform.
- Hands-on knowledge of Databricks for big data processing and analytics.
- Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
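A minimal, hypothetical extract-transform-load sketch in Python of the kind this role describes; the file name, cleansing rules, and destination table are placeholders (loading a DataFrame into BigQuery this way also requires pyarrow):

```python
import pandas as pd
from google.cloud import bigquery

def run_etl() -> None:
    # Extract: a hypothetical relational export staged as CSV.
    df = pd.read_csv("daily_orders.csv")

    # Transform: cleansing logic of the kind the posting describes.
    df = df.dropna(subset=["order_id"])
    df["amount"] = df["amount"].clip(lower=0)

    # Load: write the cleaned frame into a hypothetical BigQuery table.
    client = bigquery.Client()
    job = client.load_table_from_dataframe(df, "example-project.sales.daily_orders")
    job.result()  # block until the load completes

if __name__ == "__main__":
    run_etl()
```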

Posted 1 week ago

Apply


2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


2-4 years of experience using Microsoft SQL Server (version 2008 or later); ability to create and maintain complex T-SQL queries, views, and stored procedures.
0-1+ years of experience performing advanced ETL development, including various dataflow transformation tasks.
Ability to monitor performance and improve it by optimizing code and creating indexes.
Proficient with Microsoft Access and Microsoft Excel.
Knowledge of descriptive statistical modeling methodologies and techniques, such as classification, regression, and association, to support statistical analysis of various healthcare data.
Strong knowledge of data warehousing concepts.
Strong written, verbal, and customer service skills.
Proficiency in compiling data, creating reports, and presenting information, including expertise with query tools, MS Excel, and/or products such as SSRS, Tableau, or Power BI.
Proficiency with various data forms, including but not limited to star and snowflake schemas.
Ability to translate business needs into practical applications.
Desire to work within a fast-paced environment.
Ability to work in a team environment and be flexible in taking on various projects.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Kyndryl Data Science | Bengaluru, Karnataka, India / Chennai, Tamil Nadu, India | Posted on Jun 9, 2025

Who We Are

At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role

As a GCP Data Engineer at Kyndryl, you will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment using GCP data services. You will collaborate with global architects and business teams to design and deploy innovative solutions, supporting data analytics, automation, and transformation needs.

Responsibilities

Design, develop, and maintain scalable data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
Participate in architectural discussions, conduct system analysis, and suggest optimal solutions that are scalable, future-proof, and aligned with business requirements.
Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs.
Design data models suitable for both transactional and big data environments, supporting Machine Learning workflows.
Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services.
Develop and maintain Python/PySpark code for data processing and integrate with GCP services for seamless data operations.
Develop and optimize SQL queries for data analysis and reporting.
Monitor and troubleshoot data pipeline issues to ensure timely resolution.
Implement data governance and security best practices within GCP.
Perform data quality checks and validation to ensure accuracy and consistency.
Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines.
Provide design expertise in Master Data Management (MDM), Data Quality, and Metadata Management.
Provide technical support and guidance to junior data engineers and other team members.
Participate in code reviews and contribute to continuous improvement of data engineering practices.
Implement best practices for cost management and resource utilization within GCP.

If you're ready to embrace the power of data to transform our business and embark on an epic data adventure, then join us at Kyndryl. Together, let's redefine what's possible and unleash your potential.

Your Future at Kyndryl

Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are

You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development.
You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.

Required Technical And Professional Experience

Bachelor's or master's degree in computer science, engineering, or a related field, with over 8 years of experience in data engineering.
More than 3 years of experience with the GCP data ecosystem.
Hands-on experience and strong proficiency in GCP components such as Dataflow, Dataproc, BigQuery, Cloud Functions, Composer, and Data Fusion.
Excellent command of SQL, with the ability to write complex queries and perform advanced data transformation.
Strong programming skills in PySpark and/or Python, specifically for building cloud-native data pipelines.
Familiarity with GCP tools such as Looker, Airflow DAGs, Data Studio, and App Maker.
Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP.
Knowledge of data governance, security, and compliance best practices.
Experience with private and public cloud architectures, their pros and cons, and migration considerations.
Excellent problem-solving, analytical, and critical thinking skills.
Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail.
Communication skills: able to communicate with both technical and non-technical audiences and to derive technical requirements with stakeholders.
Ability to work independently and in agile teams.

Preferred Technical And Professional Experience

GCP Data Engineer Certification is highly preferred.
Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization.
Experience working as a Data Engineer and/or in cloud modernization.
Knowledge of Databricks or Snowflake for data analytics.
Experience with NoSQL databases.
Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
Familiarity with BI dashboards; Google Data Studio is a plus.

Being You

Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect

With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone who works at Kyndryl, when asked "How Did You Hear About Us" during the application process, select "Employee Referral" and enter your contact's Kyndryl email address.

Posted 1 week ago

Apply

5.0 - 10.0 years

14 - 22 Lacs

Hyderabad

Work from Office


Role: Machine Learning Engineer

Required Skills & Experience

5+ years of hands-on experience in building, training, and deploying machine learning models in a professional, production-oriented setting.
Demonstrable experience with database creation and advanced querying (e.g., SQL, NoSQL), with a strong understanding of data warehousing concepts.
Proven expertise in data blending, transformation, and feature engineering, adept at integrating and harmonizing both structured (e.g., relational databases, CSVs) and unstructured (e.g., text, logs, images) data.
Strong practical experience with cloud platforms for machine learning development and deployment; significant experience with Google Cloud Platform (GCP) services (e.g., Vertex AI, BigQuery, Dataflow) is highly desirable.
Proficiency in programming languages commonly used in data science (Python preferred; R).
Solid understanding of various machine learning algorithms (e.g., regression, classification, clustering, dimensionality reduction) and experience with advanced techniques such as deep learning, natural language processing (NLP), or computer vision.
Experience with machine learning libraries and frameworks (e.g., scikit-learn, TensorFlow, PyTorch).
Familiarity with MLOps tools and practices, including model versioning, monitoring, A/B testing, and continuous integration/continuous deployment (CI/CD) pipelines.
Experience with containerization technologies like Docker and orchestration tools like Kubernetes for deploying ML models as REST APIs.
Proficiency with version control systems (e.g., Git, GitHub/GitLab) for collaborative development.

Interested candidates, share your CV at dikshith.nalapatla@motivitylabs.com.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Date: Jun 7, 2025
Location: Pune, MH, IN
Company: HMH

HMH is a learning technology company committed to delivering connected solutions that engage learners, empower educators and improve student outcomes. As a leading provider of K–12 core curriculum, supplemental and intervention solutions, and professional learning services, HMH partners with educators and school districts to uncover solutions that unlock students' potential and extend teachers' capabilities. HMH serves more than 50 million students and 4 million educators in 150 countries. HMH Technology India Pvt. Ltd. is our technology and innovation arm in India, focused on developing novel products and solutions using cutting-edge technology to better serve our clients globally. HMH aims to help employees grow as people, and not just as professionals. For more information, visit www.hmhco.com

The data architect is responsible for designing, creating, and managing an organization's data architecture. This role is critical in establishing a solid foundation for data management within an organization, ensuring that data is organized, accessible, secure, and aligned with business objectives. The data architect designs data models, warehouses, file systems and databases, and defines how data will be collected and organized.

Responsibilities

Interprets and delivers impactful strategic plans improving data integration, data quality, and data delivery in support of business initiatives and roadmaps.
Designs the structure and layout of data systems, including databases, warehouses, and lakes.
Selects and designs database management systems that meet the organization's needs by defining data schemas, optimizing data storage, and establishing data access controls and security measures.
Defines and implements the long-term technology strategy and innovation roadmaps across analytics, data engineering, and data platforms.
Designs ETL processes that move data from various sources into the organization's data systems.
Translates high-level business requirements into data models and appropriate metadata, test data, and data quality standards.
Manages senior business stakeholders to secure strong engagement and ensures that project delivery aligns with longer-term strategic roadmaps.
Simplifies the existing data architecture, delivering reusable services and cost-saving opportunities in line with the policies and standards of the company.
Leads and participates in the peer review and quality assurance of project architectural artifacts across the EA group through governance forums.
Defines and manages standards, guidelines, and processes to ensure data quality.
Works with IT teams, business analysts, and data analytics teams to understand data consumers' needs and develop solutions.
Evaluates and recommends emerging technologies for data management, storage, and analytics.
Designs, creates, and implements logical and physical data models for both IT and business solutions to capture the structure, relationships, and constraints of relevant datasets.
Builds and operationalizes complex data solutions, corrects problems, applies transformations, and recommends data cleansing/quality solutions.
Effectively collaborates and communicates with various stakeholders to understand data and business requirements and translate them into data models.
Creates entity-relationship diagrams (ERDs), data flow diagrams, and other visualization tools to represent data models.
Collaborates with database administrators and software engineers to implement and maintain data models in databases, data warehouses, and data lakes.
Develops data modeling best practices, and uses these standards to identify and resolve data modeling issues and conflicts.
Conducts performance tuning and optimization of data models for efficient data access and retrieval.
Incorporates core data management competencies, including data governance, data security, and data quality.

Education Job Requirements

A bachelor's degree in computer science, data science, engineering, or a related field.

Experience

At least five years of relevant experience in design and implementation of data models for enterprise data warehouse initiatives.
Experience leading projects involving data warehousing, data modeling, and data analysis.
Design experience in Azure Databricks, PySpark, and Power BI/Tableau.

Skills

Ability in programming languages such as Java, Python, and C/C++.
Ability in data science languages/tools such as SQL, R, SAS, or Excel.
Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP), real-time data distribution (Kafka, Dataflow), and modern data warehouse tools (Snowflake, Databricks).
Experience with database technologies such as SQL, NoSQL, Oracle, Hadoop, or Teradata.
Understanding of entity-relationship modeling, metadata systems, and data quality tools and techniques.
Ability to think strategically and relate architectural decisions and recommendations to business needs and client culture.
Ability to assess traditional and modern data architecture components based on business needs.
Experience with business intelligence tools and technologies such as ETL, Power BI, and Tableau.
Ability to regularly learn and adopt new technology, especially in the ML/AI realm.
Strong analytical and problem-solving skills.
Ability to synthesize and clearly communicate large volumes of complex information to senior management with varying levels of technical understanding.
Ability to collaborate and excel in complex, cross-functional teams involving data scientists, business analysts, and stakeholders.
Ability to guide solution design and architecture to meet business needs.
Expert knowledge of data modeling concepts, methodologies, and best practices.
Proficiency in data modeling tools such as Erwin or ER/Studio.
Knowledge of relational databases and database design principles.
Familiarity with dimensional modeling and data warehousing concepts.
Strong SQL skills for data querying, manipulation, and optimization, and knowledge of other data science languages, including JavaScript, Python, and R.
Ability to collaborate with cross-functional teams and stakeholders to gather requirements and align on data models.
Excellent analytical and problem-solving skills to identify and resolve data modeling issues.
Strong communication and documentation skills to effectively convey complex data modeling concepts to technical and business stakeholders.

HMH Technology Private Limited is an Equal Opportunity Employer and considers applicants for all positions without regard to race, colour, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. We are committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. For more information, visit https://careers.hmhco.com/.
Follow us on Twitter, Facebook, LinkedIn, and YouTube. Job Segment: Curriculum, Social Media, Education, Marketing

Posted 1 week ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job description: Overall more than 5 years of experience in data projects. Good knowledge of GCP, BigQuery, SQL, and Python; Dataflow skills. Has worked on implementation projects building data pipelines, transformation logic, and data models.

Job Title: GCP Data Engineer
Belongs to: Data Management Engineering
Education: Bachelor of Engineering in any discipline, or equivalent

Desired Candidate Profile

Technology/Engineering expertise: 4+ years of experience in implementing data solutions using GCP BigQuery and SQL programming; proficient in dealing with the data access layer (RDBMS/NoSQL); experience in implementing and deploying big data applications with GCP Big Data Services; SQL skills are good to have.
Able to deal with a diverse set of stakeholders; proficient in articulation, communication, and presentation; high integrity; problem-solving skills and a learning attitude; team player.

Key Responsibilities

Implement data solutions using GCP; needs to be familiar with programming in SQL and Python.
Ensure clarity on NFRs and implement these requirements.
Work with the Client Technical Manager, understanding the customer's landscape and IT priorities.
Lead performance engineering and capacity planning exercises for databases.

Technology/Engineering Expertise

4+ years of experience in implementing data pipelines for data analytics solutions.
Experience in solutions using Google Cloud Dataflow, Apache Beam, and Java programming.
Proficient in dealing with the data access layer (RDBMS/NoSQL).
Experience in implementing and deploying big data applications with GCP Big Data Services.
Good to have SQL skills.
Experience with different development methodologies (RUP, Scrum, XP).

Soft Skills

Able to deal with a diverse set of stakeholders.
Proficient in articulation, communication, and presentation.
High integrity, problem-solving skills, a learning attitude, and a team player.

Skills

Mandatory Skills: GCP Storage, GCP BigQuery, GCP DataProc, GCP Cloud Composer, GCP DMS, Apache Airflow, Java, Python, Scala, GCP Datastream, Google Analytics Hub, GCP Workflows, GCP Dataform, GCP Datafusion, GCP Pub/Sub, ANSI-SQL, GCP Dataflow, GCP Cloud Pub/Sub, Big Data Hadoop Ecosystem

Posted 1 week ago

Apply

1.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Description

Hands-on experience with data tools and technologies is a must.
Partial design experience is acceptable, but the core focus should be on strong data skills.
Will be supporting the pre-sales team from a hands-on technical perspective.
GCP experience: Looker, BigQuery, or Vertex – any of these, with 6 months to 1 year of experience.

Requirements

On day one we'll expect you to:

Own the modules and take complete ownership of the project.
Understand the scope, design, and business objective of the project and articulate it in the form of a design document.
Bring strong experience with Google Cloud Platform data services, including BigQuery, Dataflow, Dataproc, Cloud Composer, Vertex AI Studio, and GenAI (Gemini, Imagen, Veo).
Have experience implementing data governance on GCP.
Be familiar with integrating GCP services with other platforms like Snowflake; hands-on Snowflake project experience is a plus.
Be an experienced coder in Python, SQL, ETL, and orchestration tools.
Have experience with containerized solutions using Google Kubernetes Engine.
Bring good communication skills to interact with internal teams and customers.
Have expertise in PySpark (both batch and real-time), Kafka, SQL, and data querying tools.
Have experience working in a team, continuously monitoring, working hands-on as an individual contributor, and helping the team deliver their work as you deliver yours.
Have experience working with large volumes of data in a distributed environment, keeping in mind parallelism and concurrency, to ensure performant and resilient systems.
Optimize the deployment architecture to reduce job run times and resource utilization.
Develop and optimize data warehouses given the schema design.

Posted 1 week ago

Apply

55.0 years

6 - 9 Lacs

Pune

Remote

Capgemini Invent

Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses.

Your Role

Data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP.
Experience with cloud storage, cloud databases, cloud data warehousing, and data lake solutions such as Snowflake, BigQuery, AWS Redshift, ADLS, and S3.
Good knowledge of cloud compute services and load balancing.
Good knowledge of cloud identity management, authentication, and authorization.
Proficiency in using cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, and Azure Functions.
Experience in using cloud data integration services for structured, semi-structured, and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, and Dataproc.

Your Profile

Good knowledge of infra capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infra investment vs. performance and scaling.
Able to contribute to making architectural choices using various cloud services and solution methodologies.
Expertise in programming using Python.
Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on cloud.
Must understand networking, security, design principles, and best practices in cloud.

What you will love about working here

We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini

Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 week ago

Apply

0 years

6 - 8 Lacs

Bengaluru

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

JD for L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:

Proficient in Individual and Group Life Insurance concepts, different types of annuity products, etc.
Proficient in different insurance plans: Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP.
Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (systematic withdrawals, RMD, surrenders); Regulatory Changes & Taxation.
Understanding of business rules of pay-out.
Demonstrated understanding of insurance company operations such as nonforfeiture options, face amount increases/decreases, CVAT or GPT calculations, and dollar cost averaging, with the ability to perform their respective transactions.
Understanding of upstream and downstream interfaces for the policy lifecycle.

Consulting Skills

Experience in creating business process maps for future-state architecture, creating the WBS for an overall conversion strategy, and the requirement refinement process in multi-vendor engagements.
Worked on multiple business transformation and modernization programs.
Conducted multiple due-diligence and assessment projects as part of transformation roadmaps to evaluate current-state maturity, gaps in functionality, and COTS solution features.
Requirements gathering and elicitation: writing BRDs and FSDs; conducting JAD sessions and workshops to capture requirements; working closely with the Product Owner.
Work with the client to define the most optimal future-state operational process and related product configuration.
Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value.
Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
Work closely with the product design development team to analyse and extract functional enhancements.
Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Technology Skills

Proficient in technology solution architecture, with a focus on designing innovative and effective solutions.
Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security.
Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making.
Strong understanding of data governance principles and best practices, ensuring data quality and compliance.
Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions.

Industry certifications (AAPA/LOMA) will be an added advantage. Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client.
You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:

Proficient in Individual and Group Life Insurance concepts, different types of annuity products, etc.
Proficient in different insurance plans: Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP.
Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (systematic withdrawals, RMD, surrenders); Regulatory Changes & Taxation.
Understanding of business rules of pay-out.
Understanding of upstream and downstream interfaces for the policy lifecycle.
Experience in DXC platforms: Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus.

Consulting Skills

Experience in creating business process maps for future-state architecture, creating the WBS for an overall conversion strategy, and the requirement refinement process in multi-vendor engagements.
Requirements gathering and elicitation: writing BRDs and FSDs; conducting JAD sessions and workshops to capture requirements; working closely with the Product Owner.
Work with the client to define the most optimal future-state operational process and related product configuration.
Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value.
Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
Work closely with the product design development team to analyse and extract functional enhancements.
Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Technology Skills

Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security.
Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making.
Strong understanding of data governance principles and best practices, ensuring data quality and compliance.
Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions.

Industry certifications (AAPA/LOMA) will be an added advantage. Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Overview: TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities and the planet.

Job Title: GCP Data Engineer
Location: Chennai
Duration: 12 Months
Work Type: Onsite

Position Description:

Bachelor's Degree
2+ years in GCP services: BigQuery, Dataflow, Dataproc, DataPlex, DataFusion, Terraform, Tekton, Cloud SQL, Redis Memory, Airflow, Cloud Storage
2+ years in data transfer utilities
2+ years in Git or any other version control tool
2+ years in Confluent Kafka
1+ years of experience in API development
2+ years in an Agile framework
4+ years of strong experience in Python and PySpark development
4+ years of shell scripting to develop ad hoc jobs for data importing/exporting

Skills Required: Python, Dataflow, Dataproc, GCP Cloud Run, DataForm, Agile Software Development, BigQuery, Terraform, Data Fusion, Cloud SQL, GCP, Kafka
Skills Preferred: Java
Experience Required: 8+ years
Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.

Posted 1 week ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Our company is seeking a seasoned Senior Data Engineer to join our dynamic team. In this role, you will concentrate on data integration and ETL projects for cloud-based systems. Your primary responsibilities will be to design and implement sophisticated data solutions, ensuring data accuracy, reliability, and accessibility.

Responsibilities

Design and implement sophisticated data solutions for cloud-based systems
Develop ETL workflows utilizing SQL, Python, and additional relevant technologies
Maintain data accuracy, reliability, and accessibility for all stakeholders
Collaborate with cross-functional teams to gauge data integration needs and specifications
Create and uphold documentation, such as technical specifications, data flow diagrams, and data mappings
Monitor and refine data integration processes to boost performance and efficiency while ensuring data accuracy and integrity

Requirements

Bachelor's degree in Computer Science, Electrical Engineering, or related fields
5-8 years of experience in data engineering
Experience using cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for data querying and manipulation
Proficiency in Snowflake for data warehousing
Familiarity with cloud platforms such as AWS, GCP, or Azure for data storage and processing
Excellent problem-solving skills and attention to detail
Good oral and written communication skills in English at a B2 level

Nice to have

Proficiency in ETL using Python

Posted 1 week ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: React.js, Cloud Network Operations
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:

- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement scalable applications using Google BigQuery.
- Collaborate with cross-functional teams to ensure application functionality.
- Conduct code reviews and provide technical guidance to junior developers.
- Stay updated on industry trends and best practices in application development.
- Troubleshoot and resolve application issues in a timely manner.

Professional & Technical Skills (project specific):

- BigQuery, BigQuery Geospatial, Python, Dataflow, Composer; secondary skill: geospatial domain knowledge.
- Must To Have Skills: Proficiency in Google BigQuery.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:

- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do

Demonstrate a deep understanding of cloud-native, distributed microservice-based architectures
Deliver solutions for complex business problems through the standard software SDLC
Build strong relationships with both internal and external stakeholders, including product, business and sales partners
Demonstrate excellent communication skills, with the ability to both simplify complex problems and dive deeper if needed
Build and manage strong technical teams that deliver complex software solutions that scale
Manage teams with cross-functional skills that include software, quality, and reliability engineers, project managers and scrum masters
Provide deep troubleshooting skills with the ability to lead and solve production and customer issues under pressure
Leverage strong experience in full-stack software development and public cloud platforms like GCP and AWS
Mentor, coach and develop junior and senior software, quality and reliability engineers
Lead with a data/metrics-driven mindset, with a maniacal focus on optimizing and creating efficient solutions
Ensure compliance with EFX secure software development guidelines and best practices; responsible for meeting and maintaining QE, DevSec, and FinOps KPIs
Define, maintain and report SLAs, SLOs, and SLIs meeting EFX engineering standards, in partnership with the product, engineering and architecture teams
Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices
Drive up-to-date technical documentation, including support and end-user documentation and runbooks
Lead Sprint planning, Sprint retrospectives, and other team activities
Responsible for implementation architecture decision-making associated with product features/stories, refactoring work, and EOSL decisions
Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision, and present complex information in a concise format that is audience appropriate

What Experience You Need

Bachelor's degree or equivalent experience
7+ years of software engineering experience
7+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
7+ years experience with cloud technology: GCP, AWS, or Azure
7+ years experience designing and developing cloud-native solutions
7+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
7+ years experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart

Self-starter who identifies and responds to priority shifts with minimal supervision
Strong communication and presentation skills
Strong leadership qualities
Demonstrated problem-solving skills and the ability to resolve conflicts
Experience creating and maintaining product and software roadmaps
Experience overseeing yearly as well as product/project budgets
Working in a highly regulated environment
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA and microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
Developing with a modern JDK (v1.7+)
Automated testing: JUnit, Selenium, LoadRunner, SoapUI

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?

At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 1 week ago

Apply

Exploring Dataflow Jobs in India

The dataflow job market in India is currently experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making in various industries, the need for individuals proficient in managing and analyzing dataflow is on the rise. This article aims to provide job seekers with valuable insights into the dataflow job landscape in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi

These cities are known for their thriving tech ecosystems and are home to numerous companies actively hiring for dataflow roles.

Average Salary Range

The average salary for dataflow professionals in India varies by experience level. Entry-level positions typically earn INR 4-6 lakhs per annum, while experienced professionals can command INR 12-15 lakhs per annum or more.

Career Path

In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.

Related Skills

In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.
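To make these related skills concrete, the sketch below shows how Python, SQL, and GCP Dataflow typically come together in practice: an Apache Beam pipeline that reads rows from BigQuery with standard SQL, aggregates them per key, and writes the result to Cloud Storage. This is a minimal illustration, assuming the Apache Beam Python SDK with GCP extras; the project, dataset, table, and bucket names are hypothetical placeholders, not real resources.

# A minimal sketch, assuming the Apache Beam Python SDK with GCP extras
# (pip install "apache-beam[gcp]"). All project/bucket/table names below
# are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    options = PipelineOptions(
        runner="DataflowRunner",          # use "DirectRunner" to test locally
        project="example-project",        # hypothetical GCP project
        region="asia-south1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            # SQL skills: pull only the columns needed, using standard SQL.
            | "ReadOrders" >> beam.io.ReadFromBigQuery(
                query="SELECT user_id, amount FROM `example-project.sales.orders`",
                use_standard_sql=True,
            )
            # Python skills: shape each row into a (key, value) pair.
            | "KeyByUser" >> beam.Map(lambda row: (row["user_id"], row["amount"]))
            # Dataflow skills: a parallel, per-key aggregation.
            | "SumPerUser" >> beam.CombinePerKey(sum)
            | "Format" >> beam.Map(lambda kv: f"{kv[0]},{kv[1]}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/out/user_totals")
        )

if __name__ == "__main__":
    run()

A common workflow is to run this with the DirectRunner first; switching the runner option (plus the GCP project settings) is what moves the same pipeline onto the managed Dataflow service.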

Interview Questions

  • What is dataflow and how is it different from data streaming? (basic)
  • Explain the difference between batch processing and real-time processing. (medium)
  • How do you handle missing or null values in a dataset? (basic; see the sketch after this list)
  • Can you explain the concept of data lineage? (medium)
  • What is the importance of data quality in dataflow processes? (basic)
  • How do you optimize dataflow pipelines for performance? (medium)
  • Describe a time when you had to troubleshoot a dataflow issue. (medium)
  • What are some common challenges faced in dataflow projects? (medium)
  • How do you ensure data security and compliance in dataflow processes? (medium)
  • What are the key components of a dataflow architecture? (medium)
  • Explain the concept of data partitioning in dataflow. (advanced)
  • How would you handle a sudden increase in data volume in a dataflow pipeline? (advanced)
  • What role does data governance play in dataflow processes? (medium)
  • Can you discuss the advantages and disadvantages of using cloud-based dataflow solutions? (medium)
  • How do you stay updated with the latest trends and technologies in dataflow? (basic)
  • What is the significance of metadata in dataflow management? (medium)
  • Walk us through a dataflow project you have worked on from start to finish. (medium)
  • How do you ensure data quality and consistency across different data sources in a dataflow pipeline? (medium)
  • What are some best practices for monitoring and troubleshooting dataflow pipelines? (medium)
  • How do you handle data transformations and aggregations in a dataflow process? (basic)
  • What are the key performance indicators you would track in a dataflow project? (medium)
  • How do you collaborate with cross-functional teams in a dataflow project? (basic)
  • Can you explain the concept of data replication in dataflow management? (advanced)
  • How do you approach data modeling in a dataflow project? (medium)
  • Describe a challenging dataflow problem you encountered and how you resolved it. (advanced)
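As an example of how the more basic questions above can be answered in code, the sketch below handles missing or null values by routing incomplete records to a tagged side output instead of silently dropping them, then applies a simple transformation to the clean records. This again assumes the Apache Beam Python SDK, and the record fields are hypothetical.

# A minimal sketch, assuming the Apache Beam Python SDK; field names are
# hypothetical. Incomplete records go to a "bad" side output for inspection.
import apache_beam as beam

class SplitOnMissing(beam.DoFn):
    def process(self, row):
        # Route rows with a missing amount to the "bad" tag; emit the rest.
        if row.get("amount") is None:
            yield beam.pvalue.TaggedOutput("bad", row)
        else:
            yield row

with beam.Pipeline() as p:  # DirectRunner by default when run locally
    results = (
        p
        | "Create" >> beam.Create([
            {"id": 1, "amount": 10.0},
            {"id": 2, "amount": None},   # explicit null
            {"id": 3},                   # field absent entirely
        ])
        | "Split" >> beam.ParDo(SplitOnMissing()).with_outputs("bad", main="good")
    )
    # Transform only the clean records; print the rest for later review.
    (results.good
        | "Double" >> beam.Map(lambda r: {**r, "amount": r["amount"] * 2})
        | "PrintGood" >> beam.Map(lambda r: print("good:", r)))
    results.bad | "PrintBad" >> beam.Map(lambda r: print("bad:", r))

Whether to default, drop, or quarantine nulls depends on the downstream use; the tagged-output pattern shown here keeps that decision explicit and auditable, which is usually what interviewers are probing for.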

Closing Remark

As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!
