
593 Dataflow Jobs - Page 2

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

4.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Description
As a Technical Lead, you will work on both offshore and onsite client projects involving Oracle BI Applications/FAW or OBIEE/OAC/ODI implementations. You will interact with clients to understand and gather requirements, and you will be responsible for technical design, development, and system/integration testing using Oracle methodologies.

Desired Profile
End-to-end ODI, OAC, and Oracle BI Applications/FAW implementation experience
Expert knowledge of BI Applications/FAW, including basic and advanced configurations with Oracle E-Business Suite/Fusion as the source system
Expert knowledge of OBIEE/OAC RPD design and report design
Expert knowledge of ETL (ODI) design, OCI DI, and OCI Dataflow
Mandatory to have one of these skills: PL/SQL, BI Publisher, or BI Apps
Good to have: EDQ and PySpark skills
Architectural solution definition
Any industry-standard certifications will be a plus
Good knowledge of Oracle database and development; experience with database applications
Creativity, personal drive, influencing and negotiating, problem solving
Building effective relationships, customer focus, effective communication, coaching
Ready to travel as and when required by the project

Experience
8-12 years of data warehousing and business intelligence project experience
4-6 years of project experience on BI Applications/FAW and OBIEE/OAC/ODI/OCI DI, with at least 2 complete lifecycle implementations
4-6 years of specialized BI Applications and OBIEE/OAC/ODI/OCI DI customization and solution architecture experience
Worked recently on Financial, SCM, or HR Analytics implementation and configuration

Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges.
We've partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 2 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


We are looking to hire a Data Engineer for the Platform Engineering team: a collection of highly skilled individuals, ranging from development to operations, with a security-first mindset, who strive to push the boundaries of technology. We champion a DevSecOps culture and raise the bar on how and when we deploy applications to production. Our core principles are centered on automation, testing, quality, and immutability, all via code. The role is responsible for building self-service capabilities that improve our security posture and productivity and reduce time to market, with automation at the core of these objectives. The individual collaborates with teams across the organization to ensure applications are designed for Continuous Delivery (CD) and are well-architected for their targeted platform, which can be on-premise or the cloud. If you are passionate about developer productivity, cloud native applications, and container orchestration, this job is for you!

Principal Accountabilities
The incumbent is mentored by senior individuals on the team to capture the flow and bottlenecks in the holistic IT delivery process and define future tool sets.

Skills and Software Requirements
Experience with a language such as Python, Go, SQL, Java, or Scala
GCP data services (BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub, Google Cloud Storage, IAM)
Experience with Jenkins, Maven, Git, Ansible, or Chef
Experience working with containers, orchestration tools (Kubernetes, Mesos, Docker Swarm, etc.) and container registries (GCR, Docker Hub, etc.)
Experience with SaaS, PaaS, or IaaS (Software-, Platform-, or Infrastructure-as-a-Service)
Acquire, cleanse, and ingest structured and unstructured data on the cloud
Combine data from disparate sources into a single, unified, authoritative view of data (e.g., a Data Lake)
Enable and support data movement from one system/service to another
Experience implementing or supporting automated solutions to technical problems
Experience working in a team environment, proactively executing tasks while meeting agreed delivery timelines
Ability to contribute to effective and timely solutions
Excellent oral and written communication skills

CME Group: Where Futures Are Made
CME Group is the world's leading derivatives marketplace. But who we are goes deeper than that. Here, you can impact markets worldwide. Transform industries. And build a career by shaping tomorrow. We invest in your success and you own it, all while working alongside a team of leading experts who inspire you in ways big and small. Problem solvers, difference makers, trailblazers. Those are our people. And we're looking for more. At CME Group, we embrace our employees' unique experiences and skills to ensure that everyone's perspectives are acknowledged and valued. As an equal-opportunity employer, we consider all potential employees without regard to any protected characteristic.

Important Notice: Recruitment fraud is on the rise, with scammers using misleading promises of job offers and interviews to solicit money and personal information from job seekers. CME Group adheres to established procedures designed to maintain trust, confidence and security throughout our recruitment process. Learn more here.
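The Dataflow and GCP data-services experience this posting asks for usually translates into writing Apache Beam pipelines. Below is a minimal batch sketch in Python; the project, bucket, table, and column names are illustrative assumptions, not details from the posting.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_row(line):
    # Split one CSV line into a BigQuery-ready dict (columns invented for illustration).
    ts, symbol, qty = line.split(",")
    return {"ts": ts, "symbol": symbol, "qty": int(qty)}

def run():
    # DataflowRunner executes on GCP; DirectRunner would run the same code locally.
    opts = PipelineOptions(runner="DataflowRunner", project="my-project",
                           region="us-central1", temp_location="gs://my-bucket/tmp")
    with beam.Pipeline(options=opts) as p:
        (p
         | "Read" >> beam.io.ReadFromText("gs://my-bucket/raw/trades.csv",
                                          skip_header_lines=1)
         | "Parse" >> beam.Map(parse_row)
         | "Write" >> beam.io.WriteToBigQuery(
               "my-project:market_data.trades",
               schema="ts:TIMESTAMP,symbol:STRING,qty:INTEGER",
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))

if __name__ == "__main__":
    run()
```

The same Beam code runs on Dataflow or locally, which is one reason it shows up so often in GCP data-engineering stacks like this one.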

Posted 2 days ago

Apply

5.0 years

0 Lacs

Hyderābād

On-site

Minimum qualifications:
Bachelor's degree or equivalent practical experience.
5 years of experience with software development in one or more programming languages, and with data structures/algorithms.
3 years of experience testing, maintaining, or launching software products.
1 year of experience with software design and architecture.
1 year of experience in generative AI and machine learning.
1 year of experience implementing core AI/ML concepts.

Preferred qualifications:
Master's degree or PhD in Computer Science or a related technical field.
1 year of experience in a technical leadership role.
Experience with Python, Notebooks, and ML frameworks (e.g., TensorFlow).
Experience in large-scale data systems.

About the job
Google's software engineers develop the next-generation technologies that change how billions of users connect, explore, and interact with information and one another. Our products need to handle information at massive scale, and extend well beyond web search. We're looking for engineers who bring fresh ideas from all areas, including information retrieval, distributed computing, large-scale system design, networking and data storage, security, artificial intelligence, natural language processing, UI design and mobile; the list goes on and is growing every day. As a software engineer, you will work on a specific project critical to Google's needs, with opportunities to switch teams and projects as you and our fast-paced business grow and evolve. We need our engineers to be versatile, display leadership qualities, and be enthusiastic about taking on new problems across the full stack as we continue to push technology forward.

In this role, you will be responsible for designing and developing next-generation software systems at the intersection of data analytics (data warehousing, business intelligence, Spark, Dataflow, data catalog, and more) and generative AI. You will work closely with our team of experts to research, explore, and develop innovative solutions that will bring generative AI to the forefront of Google Cloud Platform (GCP) Data Analytics for our customers.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
Write and test product or system development code.
Collaborate with peers and stakeholders through design and code reviews to ensure best practices amongst available technologies (e.g., style guidelines, checking code in, accuracy, testability, and efficiency).
Contribute to existing documentation or educational content and adapt content based on product/program updates and user feedback.
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on hardware, network, or service operations and quality.
Design and implement solutions in one or more specialized Machine Learning (ML) areas, leverage ML infrastructure, and demonstrate experience in a chosen field.

Google is proud to be an equal opportunity workplace and is an affirmative action employer.
We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 2 days ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


You Lead the Way. We've Got Your Back.
With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong.

As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customers' digital lives. Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. Amex offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
This role is part of the Regulatory Reporting team. We are currently modernizing our platform and migrating it to GCP. You will contribute towards making the platform more resilient and secure for future regulatory requirements, ensuring compliance and adherence to federal regulations.

Minimum Qualifications:
- 5-8 years of overall technology experience
- Strong expertise in handling large volumes of data coming from many disparate systems
- Strong expertise with Python and PySpark
- Working knowledge of Apache Spark, Airflow, GCP BigQuery, and Dataproc open-source data processing platforms
- Working knowledge of databases and performance tuning for complex big data scenarios - Oracle DB and in-memory processing
- Cloud deployments, CI/CD, and platform resiliency
- Strong experience with SRE practices, GitHub automation, and best practices around code coverage and documentation automation
- Good experience with MVEL
- Excellent communication skills, a collaboration mindset, and the ability to work through unknowns

Preferred Qualifications:
- Understanding of Regulatory and Compliance Reports preferred
- Experience with React, Node.js
- Experience with GCP - BigQuery and Dataflow, data migration to BigQuery, and usage of Cloud SQL

We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
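To make the Python/PySpark-on-GCP work above concrete, here is a hedged sketch of a report-style aggregation job. The source paths, column names, and connector options are assumptions for illustration; writing with format("bigquery") assumes the spark-bigquery connector is available (it ships with Dataproc images).

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("regulatory-report").getOrCreate()

# Combine extracts from two disparate sources (hypothetical paths).
ledger = spark.read.parquet("gs://reg-raw/ledger/")
accounts = spark.read.option("header", True).csv("gs://reg-raw/accounts.csv")

report = (ledger.join(accounts, "account_id")
                .groupBy("region", F.to_date("posted_at").alias("business_date"))
                .agg(F.sum("amount").alias("total_amount")))

# Land the aggregate in BigQuery for downstream reporting.
(report.write.format("bigquery")
       .option("table", "reg_reports.daily_positions")
       .option("temporaryGcsBucket", "reg-tmp")
       .mode("overwrite")
       .save())
```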

Posted 2 days ago

Apply

0 years

9 Lacs

Bengaluru

On-site

Associate - Production Support Engineer
Job ID: R0388741 Full/Part-Time: Full-time Regular/Temporary: Regular Listed: 2025-06-12 Location: Bangalore

Position Overview
Job Title: Associate - Production Support Engineer
Location: Bangalore, India

Role Description
You will be operating within Corporate Bank Production as an Associate, Production Support Engineer in the Corporate Banking subdivisions. You will be accountable for driving a culture of proactive continual improvement in the Production environment through application and user-request support, troubleshooting and resolving errors in the production environment, automation of manual work, monitoring improvements, and platform hygiene; supporting the resolution of issues and conflicts; and preparing reports and meetings. The candidate should have experience in all relevant tools used in the Service Operations environment, have specialist expertise in one or more technical domains, and ensure that all associated Service Operations stakeholders are provided with an optimum level of service in line with Service Level Agreements (SLAs) / Operating Level Agreements (OLAs).

Ensure all BAU support queries from the business are handled on priority and within the agreed SLA, and ensure all application stability issues are well taken care of. Support the resolution of incidents and problems within the team. Assist with the resolution of complex incidents. Ensure that the right problem-solving techniques and processes are applied. Embrace a Continuous Service Improvement approach to resolve IT failings, drive efficiencies, and remove repetition to streamline support activities, reduce risk, and improve system availability. Be responsible for your own engineering delivery and, using data and analytics, drive a reduction in technical debt across the production environment with development and infrastructure teams. Act as a Production Engineering role model to enhance the technical capability of the Production Support teams and create a future operating model embedded with an engineering culture.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best-in-class leave policy
Gender-neutral parental leave
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for those 35 yrs. and above

Your key responsibilities
Lead by example to drive a culture of proactive continual improvement in the Production environment through automation of manual work, monitoring improvements, and platform hygiene.
Carry out technical analysis of the Production platform to identify and remediate performance and resiliency issues.
Engage in the Software Development Lifecycle (SDLC) to enhance Production standards and controls.
Update the run book and KEDB as and when required.
Participate in all BCP and component failure tests based on the run books.
Understand the flow of data through the application infrastructure; it is critical to understand the dataflow to best provide operational support.
Event monitoring and management via a 24x7 workbench that is both monitoring and regularly probing the service environment, acting on the instruction of the run book.
Drive knowledge management across the supported applications and ensure full compliance.
Work with team members to identify areas of focus where training may improve team performance and incident resolution.

Your skills and experience
Recent experience of applying technical solutions to improve the stability of production environments.
Working experience with some of the following technology skills:
Technologies/Frameworks: Unix, shell scripting and/or Python; SQL stack; Oracle 12c/19c for PL/SQL, with familiarity with OEM tooling to review AWR reports and parameters; ITIL v3 certified (must); Control-M, cron scheduling; MQ - DBUS, IBM; Java 8/OpenJDK 11 (at least) for debugging; familiarity with the Spring Boot framework; data streaming - Kafka (experience with the Confluent flavor a plus) and ZooKeeper; Hadoop framework
Configuration management tooling: Ansible
Operating system/platform: RHEL 7.x (preferred), RHEL 6.x; OpenShift (as we move towards cloud computing, and Fabric is dependent on OpenShift)
CI/CD: Jenkins (preferred)
APM tooling: one of Splunk, AppDynamics, Geneos, New Relic
Other platforms: scheduling - Control-M is a plus, Autosys, etc.; search - Elasticsearch and/or Solr is a plus
Methodology: microservices architecture, SDLC, Agile
Fundamental network topology - TCP, LAN, VPN, GSLB, GTM, etc.
Familiarity with TDD and/or BDD
Distributed systems
Experience on cloud platforms such as Azure or GCP is a plus
Familiarity with containerization/Kubernetes
Tools: ServiceNow, Jira, Confluence, Bitbucket and/or Git, IntelliJ, SQL Plus, simple Unix tooling (PuTTY, mPutty, Exceed), (PL/)SQL Developer
Good understanding of the ITIL Service Management framework, including Incident, Problem, and Change processes.
Ability to self-manage a book of work and ensure clear transparency on progress with clear, timely communication of issues.
Excellent communication skills, both written and verbal, with attention to detail.
Ability to work in a Follow-the-Sun model, in virtual teams, and in a matrix structure.
Service Operations experience within a global operations context.
6-9 years of experience in IT in large corporate environments, specifically in the area of controlled production environments, or in Financial Services Technology in a client-facing function.
Global Transaction Banking experience is a plus.
Experience of end-to-end Level 2/3/4 management and a good overview of Production/Operations Management overall.
Experience of run-book execution.
Experience of supporting complex application and infrastructure domains.
Good analytical, troubleshooting and problem-solving skills.
Working knowledge of incident tracking tools (e.g., Remedy, HEAT).
How we'll support you
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
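Given the Kafka monitoring and shell/Python automation in the stack above, a typical runbook task to automate is a consumer-lag check. The sketch below is illustrative only: broker, group, and topic names are hypothetical, and it assumes the kafka-python client.

```python
from kafka import KafkaConsumer, TopicPartition

def consumer_lag(bootstrap, group_id, topic):
    # Compare each partition's end offset with the group's committed offset.
    consumer = KafkaConsumer(bootstrap_servers=bootstrap, group_id=group_id,
                             enable_auto_commit=False)
    partitions = [TopicPartition(topic, p)
                  for p in consumer.partitions_for_topic(topic)]
    end_offsets = consumer.end_offsets(partitions)
    lag = {}
    for tp in partitions:
        committed = consumer.committed(tp) or 0
        lag[tp.partition] = end_offsets[tp] - committed
    consumer.close()
    return lag

if __name__ == "__main__":
    for partition, n in consumer_lag("broker:9092", "payments-app",
                                     "payment-events").items():
        print(f"partition {partition}: lag {n}")
```

A check like this can run from cron or Control-M and feed an alerting tool, which is the kind of manual-work automation the role describes.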

Posted 2 days ago

Apply

8.0 years

0 Lacs

Chennai

On-site

Senior Software Developer, Chennai, India.

About the Job:
Developers at Vendasta work in teams, working with Product Managers and Designers to create new features and products. Our Research and Development department works hard to help developers learn, grow, and experiment while at work. With a group of over 100 developers, we have fostered an environment that provides developers with the opportunity to continuously learn from each other. The ideal candidate will demonstrate that they are bright and can tackle tough problems while being able to communicate their solution to others. They are creative and can mix technology with the customer's problems to find the right solution. Lastly, they are driven and will motivate themselves and others to get things done. As an experienced Software Developer, we expect that you will grow into a thought leader at Vendasta, driving better results across our development organization.

Your Impact:
Develop software in teams of 3-5 developers, with the ability to take on tasks for the team and independently work on them to completion.
Follow best practices to write clean, maintainable, scalable, and tested software.
Contribute to the best engineering practices, including the use of design patterns, CI/CD, maintainable and scalable code, code review, and automated tests.
Provide input for a technical roadmap for the Product Area.
Ensure that the NFRs and technical debt get their due focus.
Work collaboratively with Product Managers to design solutions (including a technical roadmap) that help our Partners connect digital solutions to small and medium-sized businesses.
Analyze and improve current system integrations and migration strategies.
Interact and collaborate with our high-quality technical team across India and Canada.

What You Bring to the Table:
8+ years of experience in a related field, with at least 3+ years as a full-stack developer in an architect or senior development role.
Experience with, or a strong understanding of, highly scalable, data-intensive, distributed Internet applications.
Software development experience, including building distributed, microservice-style and cloud-based application architectures.
Proficiency in a modern software language, and willingness to quickly learn our technology stack.
Preference will be given to candidates with strong Go (programming language) experience who can demonstrate the ability to build and adapt web applications using Angular.
Experience designing, building, and implementing cloud-native architectures (GCP preferred).
Experience working with the Scrum framework.

Technologies We Use:
Cloud-native computing on Google Cloud Platform: BigQuery, Cloud Dataflow, Cloud Pub/Sub, Google Data Studio, Cloud IAM, Cloud Storage, Cloud SQL, Cloud Spanner, Cloud Datastore, Google Maps Platform, Stackdriver, etc. We have been invited to join the Early Access Program on quite a few GCP technologies.
GoLang, TypeScript, Python, JavaScript, HTML, Angular, gRPC, Kubernetes
Elasticsearch, MySQL, PostgreSQL

About Vendasta:
So what do we do? We create an entire platform full of digital products & solutions that help small to medium-sized businesses (SMBs) have a stronger presence online through digital advertising, online listings, reputation management, website creation, social media marketing... and much more! Our platform is used exclusively by channel partners, who sell products and services to SMBs, allowing them to leverage us to scale and grow their business.
We are trusted by 65,000+ channel partners, serving over 6 million SMBs worldwide!

Perks:
Stock options (as per policy)
Benefits - health insurance
Paid time off
Public transport reimbursement
Flex days
Training & career development - professional development plans, leadership workshops, mentorship programs, and more!
Free snacks, hot beverages, and catered lunches on Fridays
Culture - comprised of our core values: Drive, Innovation, Respect, and Agility
Provident Fund

Posted 2 days ago

Apply

3.0 years

0 Lacs

Greater Chennai Area

On-site


Join us in bringing joy to customer experience. Five9 is a leading provider of cloud contact center software, bringing the power of cloud innovation to customers worldwide. Living our values every day results in our team-first culture and enables us to innovate, grow, and thrive while enjoying the journey together. We celebrate diversity and foster an inclusive environment, empowering our employees to be their authentic selves.

The Data Engineer will help design and implement a Google Cloud Platform (GCP) Data Lake, build scalable data pipelines, and ensure seamless access to data for business intelligence and data science tools. They will support a wide range of projects while collaborating closely with management teams and business leaders. The ideal candidate will have a strong understanding of data engineering principles and data warehousing concepts, and the ability to document technical knowledge into clear processes and procedures. This position is based out of one of the offices of our affiliate Acqueon Technologies in India, and will adopt the hybrid work arrangements of that location. You will be a member of the Acqueon team with responsibilities supporting Five9 products, collaborating with global teammates based primarily in the United States.

Responsibilities
Design, implement, and maintain a scalable Data Lake on GCP to centralize structured and unstructured data from various sources (databases, APIs, cloud storage).
Utilize GCP services including BigQuery, Dataflow, Pub/Sub, and Cloud Storage to optimize and manage data workflows, ensuring scalability, performance, and security.
Collaborate closely with data analytics and data science teams to understand data needs, ensuring data is properly prepared for consumption by various systems (e.g., DOMO, Looker, Databricks).
Implement best practices for data quality, consistency, and governance across all data pipelines and systems, ensuring compliance with internal and external standards.
Continuously monitor, test, and optimize data workflows to improve performance, cost efficiency, and reliability.
Maintain comprehensive technical documentation of data pipelines, systems, and architecture for knowledge sharing and future development.

Requirements
Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related quantitative field (e.g., Mathematics, Statistics, Engineering).
3+ years of experience using GCP Data Lake and Storage services.
Certifications in GCP are preferred (e.g., Professional Cloud Developer, Professional Cloud Database Engineer).
Advanced proficiency with SQL, with experience writing complex queries, optimizing for performance, and using SQL in large-scale data processing workflows.
Proficiency in programming languages such as Python, Java, or Scala, with practical experience building data pipelines, automating data workflows, and integrating APIs for data ingestion.

Five9 embraces diversity and is committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better we are. Five9 is an equal opportunity employer. View our privacy policy, including our privacy notice to California residents, here: https://www.five9.com/pt-pt/legal.

Note: Five9 will never request that an applicant send money as a prerequisite for commencing employment with Five9.
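As an illustration of the ingestion side of the Data Lake work described above, the sketch below pulls a batch of messages from Pub/Sub and lands them as newline-delimited JSON in a Cloud Storage bucket. Project, subscription, and bucket names are assumptions.

```python
import json
from google.cloud import pubsub_v1, storage

PROJECT, SUBSCRIPTION, BUCKET = "my-project", "events-sub", "datalake-raw"

def pull_batch(max_messages=100):
    # Synchronous pull keeps the example simple; streaming pull scales better.
    subscriber = pubsub_v1.SubscriberClient()
    sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)
    response = subscriber.pull(request={"subscription": sub_path,
                                        "max_messages": max_messages})
    records = [json.loads(m.message.data) for m in response.received_messages]
    ack_ids = [m.ack_id for m in response.received_messages]
    if ack_ids:
        subscriber.acknowledge(request={"subscription": sub_path,
                                        "ack_ids": ack_ids})
    return records

def land_to_gcs(records, object_name):
    # Newline-delimited JSON can be loaded directly into BigQuery later.
    blob = storage.Client().bucket(BUCKET).blob(object_name)
    blob.upload_from_string("\n".join(json.dumps(r) for r in records))

if __name__ == "__main__":
    batch = pull_batch()
    if batch:
        land_to_gcs(batch, "raw/events/batch-0001.json")
```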

Posted 3 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


About the company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has gross revenue of ₹222.1 billion, a global workforce of 234,054, and a NASDAQ listing; it operates in over 60 countries and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. It has major delivery centers in India, in cities including Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

Job Title: GCP Data Architect
Work Mode: Hybrid
Location: Pune
Experience: 10+ years
Job Type: Contract to hire
Notice Period: Immediate joiners
Mandatory Skills: GCP, SQL

Detailed JD
A Google Cloud Platform (GCP) Data Architect with experience migrating a legacy platform to Cloud SQL (preferred) would typically have a robust set of skills and expertise, including:

Key Skills & Expertise:
1. Cloud SQL Expertise:
- Deep knowledge of Google Cloud SQL, a fully managed relational database service that supports engines such as MySQL, PostgreSQL, and SQL Server.
- Experience designing scalable, highly available, and secure database solutions on Cloud SQL.
2. Migration Strategies:
- Strong experience in data migration from mainframe databases (such as DB2 or IMS) to modern cloud-based relational databases.
- Knowledge of ETL tools (e.g., Google Cloud Dataflow, Apache Beam) for extracting, transforming, and loading mainframe data into Cloud SQL.
- Familiarity with Database Migration Service (DMS) in GCP for automated database migrations from legacy systems to Cloud SQL.
3. Data Modeling:
- Ability to translate mainframe data structures (which may use COBOL or other legacy formats) into a relational schema that fits Cloud SQL's SQL-based architecture.
- Expertise in normalizing and optimizing data models for Cloud SQL environments (must have).
- Represent the data model and seek its approval in the HSBC Data Architecture forum.
- Create and maintain the data dictionary.
4. Data Integration and Transformation:
- Proficiency in integrating data from different sources, ensuring data consistency and accuracy during migration.
- Use of Google Cloud Storage, BigQuery, or other tools for intermediate data storage and analysis during migration.
5. Cloud Architecture and Design:
- Architecting and designing highly available, fault-tolerant cloud infrastructure for running Cloud SQL databases.
- Ensuring that the design can scale horizontally or vertically as needed, and optimizing for cost efficiency.
6. Performance Tuning and Optimization:
- Experience with performance tuning, query optimization, and configuring Cloud SQL to handle high-throughput workloads.
- Monitoring tools such as the Google Cloud Operations suite (formerly Stackdriver) for real-time performance tracking.
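The schema-translation step in item 3 above is easiest to picture with a small example. This hedged sketch unpacks a fixed-width mainframe extract into rows for a Cloud SQL for PostgreSQL target; the copybook layout, table, and connection details are all hypothetical, and the extract is assumed to be already converted to ASCII text, one record per line.

```python
import psycopg2

# Field layout a COBOL copybook might define: (column, start, length).
LAYOUT = [("cust_id", 0, 8), ("name", 8, 30), ("balance", 38, 11)]

def parse_record(line):
    # Slice a fixed-width record into named fields.
    row = {col: line[start:start + size].strip() for col, start, size in LAYOUT}
    row["balance"] = int(row["balance"]) / 100  # implied two decimal places
    return row

def load(path):
    # Placeholder connection details for a Cloud SQL for PostgreSQL instance.
    conn = psycopg2.connect(host="10.0.0.5", dbname="core",
                            user="migrator", password="change-me")
    with conn, conn.cursor() as cur:
        with open(path) as f:
            for line in f:
                r = parse_record(line)
                cur.execute("INSERT INTO customers (cust_id, name, balance) "
                            "VALUES (%s, %s, %s)",
                            (r["cust_id"], r["name"], r["balance"]))

load("extract/customers.dat")
```

At scale the same mapping would typically run inside Dataflow or the Database Migration Service rather than a single script, but the copybook-to-relational translation is the same.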

Posted 3 days ago

Apply

0 years

0 Lacs

Greater Kolkata Area

On-site


Description
We are seeking a talented and experienced professional for the position of Business Systems Management, Data Engineering (DX-DSS-DEN) at Google Cloud. In this role, you will be instrumental in driving business insights through advanced data engineering practices while managing complex systems to support our cloud operations. You will leverage Google Cloud technologies to ensure data integrity, optimize workflows, and develop scalable solutions that enhance our service offerings. Collaborating closely with cross-functional teams, you will drive initiatives that improve data accessibility and usability, transforming raw data into meaningful insights that influence strategic decisions. As part of a fast-paced environment, you will be responsible for the design, development, and implementation of robust data pipelines, extracting value from vast datasets. You will also focus on creating documentation and providing training to stakeholders, ensuring that our systems are user-friendly and aligned with business requirements. If you are passionate about data management and systems optimization, and you are looking to contribute to cutting-edge technology solutions at a global leader in cloud computing, we invite you to join us and make a significant impact.

Responsibilities
Design, develop, and maintain scalable data pipelines using Google Cloud technologies.
Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
Ensure data quality and integrity by developing validation checks and monitoring processes.
Implement best practices for data governance and security across all data engineering initiatives.
Optimize existing data systems for improved performance and reduced costs.
Create comprehensive documentation of data systems and processes for future reference and training.
Provide technical support and training to stakeholders on data access and visualization.

Requirements
Bachelor's degree in Computer Science, Data Engineering, or a related field.
Proven experience in data engineering, with a focus on cloud platforms, particularly Google Cloud.
Strong proficiency in SQL and experience with big data technologies such as Hadoop, BigQuery, or Dataflow.
Solid understanding of data modeling, ETL processes, and data warehousing concepts.
Experience with programming languages such as Python, Java, or Go.
Familiarity with data visualization tools like Tableau or Data Studio.
Excellent problem-solving skills and ability to work in a fast-paced, dynamic environment.
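One of the responsibilities above, developing validation checks for data quality, often reduces to a small assertion layer over BigQuery. A minimal hedged sketch, with an illustrative table and column list:

```python
from google.cloud import bigquery

def assert_no_nulls(table, columns):
    # Fail the pipeline run if required columns contain NULLs.
    client = bigquery.Client()
    predicate = " OR ".join(f"{c} IS NULL" for c in columns)
    sql = f"SELECT COUNT(*) AS bad FROM `{table}` WHERE {predicate}"
    bad = list(client.query(sql).result())[0].bad
    if bad:
        raise ValueError(f"{table}: {bad} rows violate NOT NULL expectations")

assert_no_nulls("analytics.orders", ["order_id", "customer_id", "order_ts"])
```

Checks like this are usually scheduled alongside the pipeline so bad loads fail loudly instead of propagating downstream.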

Posted 3 days ago

Apply

10.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

Remote


About The Job
Position Name: Senior Data & AI/ML Engineer - GCP Specialization Lead
Minimum Experience: 10+ years
Expected Date of Joining: Immediate

Primary Skills
GCP Services: BigQuery, Dataflow, Pub/Sub, Vertex AI
ML Engineering: End-to-end ML pipelines using Vertex AI / Kubeflow
Programming: Python & SQL
MLOps: CI/CD for ML, model deployment & monitoring
Infrastructure-as-Code: Terraform
Data Engineering: ETL/ELT, real-time & batch pipelines
AI/ML Tools: TensorFlow, scikit-learn, XGBoost

Secondary Skills
GCP Certifications: Professional Data Engineer or ML Engineer
Data Tools: Looker, Dataform, Data Catalog
AI Governance: Model explainability, privacy, compliance (e.g., GDPR, fairness)
GCP Partner Experience: Prior involvement in a specialization journey or partner enablement

Work Location: Remote

What Makes Techjays An Inspiring Place To Work
At Techjays, we are driving the future of artificial intelligence with a bold mission to empower businesses worldwide by helping them build AI solutions that transform industries. As an established leader in the AI space, we combine deep expertise with a collaborative, agile approach to deliver impactful technology that drives meaningful change. Our global team consists of professionals who have honed their skills at leading companies such as Google, Akamai, NetApp, ADP, Cognizant Consulting, and Capgemini. With engineering teams across the globe, we deliver tailored AI software and services to clients ranging from startups to large-scale enterprises. Be part of a company that's pushing the boundaries of digital transformation. At Techjays, you'll work on exciting projects that redefine industries, innovate with the latest technologies, and contribute to solutions that make a real-world impact. Join us on our journey to shape the future with AI.

We are seeking a Senior Data & AI/ML Engineer with deep expertise in GCP who will not only build intelligent and scalable data solutions but also champion our internal capability building and partner-level excellence. This is a high-impact role for a seasoned engineer who thrives in designing GCP-native AI/ML-enabled data platforms. You'll play a dual role as a hands-on technical lead and a strategic enabler, helping drive our Google Cloud Data & AI/ML specialization track forward through successful implementations, reusable assets, and internal skill development.

Preferred Qualifications
GCP Professional Certifications: Data Engineer or Machine Learning Engineer.
Experience contributing to a GCP Partner specialization journey.
Familiarity with Looker, Data Catalog, Dataform, or other GCP data ecosystem tools.
Knowledge of data privacy, model explainability, and AI governance is a plus.

Key Responsibilities
Data & AI/ML Architecture
Design and implement data architectures for real-time and batch pipelines, leveraging GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI, and Cloud Storage.
Lead the development of ML pipelines, from feature engineering to model training and deployment, using Vertex AI, AI Platform, and Kubeflow Pipelines.
Collaborate with data scientists to operationalize ML models and support MLOps practices using Cloud Functions, CI/CD, and Model Registry.
Define and implement data governance, lineage, monitoring, and quality frameworks.

Google Cloud Partner Enablement
Build and document GCP-native solutions and architectures that can be used for case studies and specialization submissions.
Lead client-facing PoCs or MVPs to showcase AI/ML capabilities using GCP.
Contribute to building repeatable solution accelerators in Data & AI/ML.
Work with the leadership team to align with Google Cloud Partner Program metrics.

Team Development
Mentor engineers and data scientists toward achieving GCP certifications, especially in Data Engineering and Machine Learning.
Organize and lead internal GCP AI/ML enablement sessions.
Represent the company in Google partner ecosystem events, tech talks, and joint GTM engagements.

What We Offer
Best-in-class packages.
Paid holidays and flexible time-off policies.
Casual dress code and a flexible working environment.
Opportunities for professional development in an engaging, fast-paced environment.
Medical insurance covering self and family up to 4 lakhs per person.
Diverse and multicultural work environment.
Be part of an innovation-driven culture with ample support and resources to succeed.
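The end-to-end ML pipeline work described above, with Vertex AI running Kubeflow Pipelines, can be sketched with the KFP v2 SDK. The components below are placeholders that only show the pipeline wiring; the component bodies, base images, and table names are assumptions.

```python
from kfp import dsl

@dsl.component(base_image="python:3.11")
def make_features(raw_table: str, features: dsl.OutputPath(str)):
    # Real feature engineering would read raw_table (e.g., from BigQuery)
    # and write a training-ready dataset to `features`.
    with open(features, "w") as f:
        f.write(raw_table)  # placeholder

@dsl.component(base_image="python:3.11")
def train(features: dsl.InputPath(str), model: dsl.OutputPath(str)):
    # Training (e.g., scikit-learn or XGBoost) would happen here.
    with open(model, "w") as f:
        f.write("model")  # placeholder

@dsl.pipeline(name="churn-training")
def pipeline(raw_table: str = "proj.ds.events"):
    feats = make_features(raw_table=raw_table)
    train(features=feats.outputs["features"])
```

Compiled with kfp.compiler and submitted as a Vertex AI PipelineJob, a skeleton like this becomes the backbone of the feature-engineering-to-deployment flow the role covers.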

Posted 3 days ago

Apply

14.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


At Cotality, we are driven by a single mission—to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society. Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry.

Job Description
In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's Product Development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights.

The Manager of Data Engineering is a leader with a passion for leveraging technical solutions to address business challenges. This role requires deep expertise in agile development practices and a commitment to extreme ownership of software products and platforms. The ideal candidate will collaborate closely with Product Managers and Engineering teams to drive both transformational and operational initiatives.

Key Responsibilities Would Include
Motivate a high-performing team to achieve outstanding results while fostering individual growth and development.
Proven track record of leading teams that have successfully built and launched products in highly scalable growth markets.
Experience in building teams and leading organizational change within a fast-paced and highly regarded technology company.
Partner with business product leadership to develop strategies and technology roadmaps.
Positive client engagement at senior levels, managing and navigating relationships with peers, business line leaders, and engagement across the Protect matrix of services.
IT solution development that aligns with the company's strategic architecture.
Manage vendor solutions and deliverables.
Responsible for the delivery of complex Data Engineering projects, including data marts, data lakes, data interchanges, and data warehouses.
Responsible for managing the costs, optimizations, and overall cost strategies for cloud computing in your domain.

Job Qualifications
Experience:
Total experience must be in the range of 14-16 years.
10+ years of progressive experience in Data Engineering.
Must have 5+ years of experience in MSSQL, Dataflow, GCP, and Snowflake.
5-8 years of experience leading technology teams.
Proven success in delivering complex, high-impact projects with measurable business outcomes.
Minimum 3 years supporting cloud-based data technologies.

Education:
A BS in Computer Science or an equivalent degree is highly preferred.

Technical Expertise:
Experience developing, or leading development teams using, the following technologies (or similar): Google Dataflow, Google Airflow, Microsoft SQL Server Integration Services (SSIS), PostgreSQL, Google BigQuery, Google Cloud Platform, REST services, and data visualization tools (e.g., Power BI, SSRS).

Industry Experience: Background in the Property and Casualty Insurance industry is preferred.

Cotality's Diversity Commitment
Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone's unique contributions, experiences and values. We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences.

Equal Opportunity Employer Statement
Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace. Please apply on our website for consideration.

Privacy Policy: Global Applicant Privacy Policy

By providing your telephone number, you agree to receive automated (SMS) text messages at that number from Cotality regarding all matters related to your application and, if you are hired, your employment and company business. Message & data rates may apply. You can opt out at any time by responding STOP or UNSUBSCRIBING and will automatically be opted out company-wide.

Posted 3 days ago

Apply

7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do
Demonstrate a deep understanding of cloud-native, distributed, microservice-based architectures.
Deliver solutions for complex business problems through a standard software SDLC.
Build strong relationships with both internal and external stakeholders, including product, business and sales partners.
Demonstrate excellent communication skills, with the ability to both simplify complex problems and dive deeper if needed.
Build and manage strong technical teams that deliver complex software solutions that scale.
Manage teams with cross-functional skills that include software, quality, and reliability engineers, project managers, and scrum masters.
Provide deep troubleshooting skills with the ability to lead and solve production and customer issues under pressure.
Leverage strong experience in full-stack software development and public cloud such as GCP and AWS.
Mentor, coach and develop junior and senior software, quality and reliability engineers.
Lead with a data/metrics-driven mindset with a maniacal focus on optimizing and creating efficient solutions.
Ensure compliance with EFX secure software development guidelines and best practices, and be responsible for meeting and maintaining QE, DevSec, and FinOps KPIs.
Define, maintain and report SLAs, SLOs, and SLIs meeting EFX engineering standards, in partnership with the product, engineering and architecture teams.
Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices.
Drive up-to-date technical documentation, including support and end-user documentation and run books.
Lead sprint planning, sprint retrospectives, and other team activities.
Be responsible for implementation architecture decision-making associated with product features/stories, refactoring work, and EOSL decisions.
Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision, and present complex information in a concise format that is audience-appropriate.

What Experience You Need
Bachelor's degree or equivalent experience
7+ years of software engineering experience
7+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
7+ years of experience with cloud technology: GCP, AWS, or Azure
7+ years of experience designing and developing cloud-native solutions
7+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
7+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What Could Set You Apart
Self-starter who identifies and responds to priority shifts with minimal supervision.
Strong communication and presentation skills
Strong leadership qualities
Demonstrated problem-solving skills and the ability to resolve conflicts
Experience creating and maintaining product and software roadmaps
Experience overseeing yearly as well as product/project budgets
Working in a highly regulated environment
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
UI development (e.g., HTML, JavaScript, Angular and Bootstrap)
Experience with backend technologies such as Java/J2EE, Spring Boot, SOA and microservices
Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle
Agile environments (e.g., Scrum, XP)
Relational databases (e.g., SQL Server, MySQL)
Atlassian tooling (e.g., Jira, Confluence, and GitHub)
Developing with a modern JDK (v1.7+)
Automated testing: JUnit, Selenium, LoadRunner, SoapUI
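The big data item above (Dataflow/Apache Beam with Pub/Sub and BigQuery) often means streaming with windows rather than batch. Here is a hedged, illustrative sketch of one-minute page-view counts; the topic, table, and message format are assumptions:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows

opts = PipelineOptions(streaming=True)

with beam.Pipeline(options=opts) as p:
    (p
     | beam.io.ReadFromPubSub(topic="projects/my-proj/topics/clicks")
     | beam.Map(lambda msg: (msg.decode("utf-8").split(",")[0], 1))  # key by page
     | beam.WindowInto(FixedWindows(60))                             # 1-minute windows
     | beam.CombinePerKey(sum)
     | beam.Map(lambda kv: {"page": kv[0], "views": kv[1]})
     | beam.io.WriteToBigQuery("my-proj:web.page_views_per_minute",
                               schema="page:STRING,views:INTEGER"))
```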

Posted 3 days ago

Apply

2.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


What you'll do:
As a Senior Developer at Equifax, you will oversee and steer the delivery of innovative batch and data products, primarily leveraging Java and the Google Cloud Platform (GCP). Your expertise will ensure the efficient and timely deployment of high-quality data solutions that support our business objectives and client needs.

Project Development: Development and deployment of batch and data products, ensuring alignment with business goals and technical requirements.
Technical Oversight: Provide technical direction and oversee the implementation of solutions using Java and GCP, ensuring best practices in coding, performance, and security.
Team Management: Mentor and guide junior developers and engineers, fostering a collaborative and high-performance environment.
Stakeholder Collaboration: Work closely with cross-functional teams, including product managers, business analysts, and other stakeholders, to gather requirements and translate them into technical solutions.
Documentation and Compliance: Maintain comprehensive documentation for all technical processes, ensuring compliance with internal and external standards.
Continuous Improvement: Advocate for and implement continuous improvement practices within the team, staying abreast of emerging technologies and methodologies.

What experience you need:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience: Minimum of 2-5 years of relevant experience in software development, with a focus on batch processing and data solutions.
Technical Skills:
Proficiency in Java, with a strong understanding of its ecosystems.
2+ years of relevant Java, Spring, Spring Boot, REST, microservices, Hibernate, JPA, RDBMS.
Minimum 2 years with Git, CI/CD pipelines, Jenkins.
Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable and other GCP tools.
Familiarity with ETL processes, data modeling, and SQL.
Soft Skills: Strong problem-solving abilities and a proactive approach to project management. Effective communication and interpersonal skills, with the ability to convey technical concepts to non-technical stakeholders.

What could set you apart:
Certification in Google Cloud (e.g., Associate Cloud Engineer).
Experience with other cloud platforms (AWS, Azure) is a plus.
Understanding of data privacy regulations and compliance frameworks.
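Although the posting centers on Java, the batch data-product pattern it describes (land a file, load it, verify) is compact enough to sketch with the BigQuery Python client; the URIs and table names below are assumptions for illustration.

```python
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer schema; production jobs usually pin one explicitly
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

job = client.load_table_from_uri(
    "gs://batch-drop/daily/accounts_20240601.csv",
    "my-project.batch.accounts",
    job_config=job_config,
)
job.result()  # block until the load job finishes

table = client.get_table("my-project.batch.accounts")
print(f"Loaded {table.num_rows} rows")
```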

Posted 3 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


🚀 We're Hiring: Senior Software Engineer – GCP | Python | Angular

We're looking for a highly skilled and passionate Software Engineer to join our fast-paced, product-focused engineering team. In this role, you'll be involved in end-to-end development — from design and implementation to testing, deployment, and support. If you thrive in a modern cloud-native, CI/CD-driven development environment and enjoy working on impactful features across the stack, we'd love to hear from you!

📍 Location: Chennai
💼 Join a cutting-edge digital team powering innovation for a globally renowned automotive and mobility leader.

🔧 Key Skills Required:
Languages & Frameworks: Python, Java, JavaScript (Node.js), Angular, RESTful APIs
Cloud & DevOps: Google Cloud Platform (GCP), Cloud Run, BigQuery, Git, Jenkins, CI/CD
Data & Infrastructure: Dataflow, Terraform, Airflow
Testing & Best Practices: Jest, Mocha, TDD, Clean Code, Design Patterns

👤 Experience & Qualifications:
5+ years of professional software development experience
Bachelor's degree in Computer Science, Engineering, or a related field
Experience building scalable full-stack solutions in a cloud environment
Strong understanding of Agile, CI/CD, and DevOps practices

✨ Why Join Us?
Work on cutting-edge tech, including LLM integrations
Be part of a team that values quality, ownership, and innovation
Collaborate across product, engineering, and DevOps in a cloud-native setup

📩 Interested? Drop your profile or DM for a quick conversation.
📌 Immediate to 30-day joiners preferred

#Hiring #SoftwareEngineer #FullStackDeveloper #Python #Angular #GCP #CloudRun #BigQuery #DevOps #LLM #CICD #ImmediateJoiners #ChennaiJobs #GoogleCloud

Posted 3 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


We are seeking a skilled Database Migration Specialist with deep expertise in mainframe modernization and data migration to cloud platforms such as AWS, Azure, or GCP. The ideal candidate will have hands-on experience migrating legacy systems (COBOL, DB2, IMS, VSAM, etc.) to modern cloud-native databases like PostgreSQL, Oracle, or NoSQL.

What will your job look like?
Lead and execute end-to-end mainframe-to-cloud database migration projects.
Analyze legacy systems (z/OS, Unisys) and design modern data architectures.
Extract, transform, and load (ETL) complex datasets, ensuring data integrity and taxonomy alignment.
Collaborate with cloud architects and application teams to ensure seamless integration.
Optimize performance and scalability of migrated databases.
Document migration processes, tools, and best practices.

Required Skills & Experience
5+ years in mainframe systems (COBOL, CICS, DB2, IMS, JCL, VSAM, Datacom).
Proven experience in cloud migration (AWS DMS, Azure Data Factory, GCP Dataflow, etc.).
Strong knowledge of ETL tools, data modeling, and schema conversion.
Experience with PostgreSQL, Oracle, or other cloud-native databases.
Familiarity with data governance, security, and compliance in cloud environments.
Excellent problem-solving and communication skills.
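A routine step in migrations like these is reconciling the landed data against the source. As an illustrative sketch only — the listing prescribes no tooling, and the connection string, tables, and counts below are hypothetical — a post-load row-count check against a PostgreSQL target might look like:

```python
# Post-migration row-count reconciliation sketch (hypothetical DSN/tables).
import psycopg2

# Counts captured from the mainframe unload step (hypothetical values).
EXPECTED_COUNTS = {
    "customers": 1_204_331,
    "accounts": 3_877_109,
}

conn = psycopg2.connect("dbname=target host=db.example.com user=migrator")
with conn, conn.cursor() as cur:
    for table, expected in EXPECTED_COUNTS.items():
        # Table names come from our own trusted dict, so f-string use is safe here.
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        actual = cur.fetchone()[0]
        status = "OK" if actual == expected else "MISMATCH"
        print(f"{table}: expected={expected} actual={actual} {status}")
```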

Posted 3 days ago

Apply

1.0 years

0 Lacs

India

On-site

Job description:
1. Provide counselling and guidance to students and help them choose the most appropriate courses.
2. Maintain student records and admission details.
3. Handle student queries before and after the admission process.
4. Handle DataFlow verification and exam registration for overseas medical licensure examinations.
5. Manage the daily activity report.
6. Maintain master data of customers and clients.
7. Ensure the smooth flow of business activities.

Qualifications:
1. Freshers can apply.
2. Excellent communication skills (English & Malayalam).
3. Proficient with Microsoft Office, Excel.
4. Customer handling skills.
5. Excellent record-keeping skills.
6. Ability to multitask and prioritize daily workload.

Please share your updated resume with photo.

Job Type: Full-time
Pay: From ₹12,000.00 per month
Schedule: Day shift
Education: Bachelor's (Required)
Experience: Admin: 1 year (Required)
Work Location: In person

Posted 3 days ago

Apply

7.0 - 9.0 years

0 Lacs

New Delhi, Delhi, India

On-site


The purpose of this role is to understand, model and facilitate change in a significant area of the business and technology portfolio, either by line of business, geography or specific architecture domain, whilst building the overall architecture capability and knowledge base of the company.

Job Description:

Role Overview: We are seeking a highly skilled and motivated Cloud Data Engineering Manager to join our team. The role is critical to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns. The GCP Data Engineering Manager will design, implement, and maintain scalable, reliable, and efficient data solutions on Google Cloud Platform (GCP). The role focuses on enabling data-driven decision-making by developing ETL/ELT pipelines, managing large-scale datasets, and optimizing data workflows. The ideal candidate is a proactive problem-solver with strong technical expertise in GCP, a passion for data engineering, and a commitment to delivering high-quality solutions aligned with business needs.

Key Responsibilities:

Data Engineering & Development: Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data. Implement enterprise-level data solutions using GCP services such as BigQuery, Dataform, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer. Develop and optimize data architectures that support real-time and batch data processing. Build, optimize, and maintain CI/CD pipelines using tools like Jenkins, GitLab, or Google Cloud Build. Automate testing, integration, and deployment processes to ensure fast and reliable software delivery.

Cloud Infrastructure Management: Manage and deploy GCP infrastructure components to enable seamless data workflows. Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices.

Infrastructure Automation and Management: Design, deploy, and maintain scalable and secure infrastructure on GCP. Implement Infrastructure as Code (IaC) using tools like Terraform. Manage Kubernetes clusters (GKE) for containerized workloads.

Collaboration and Stakeholder Engagement: Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals. Translate business requirements into scalable, technical solutions while collaborating with team members to ensure successful implementation.

Quality Assurance & Optimization: Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations. Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines. Monitor and optimize pipeline performance to meet SLAs and minimize operational costs.

Qualifications and Certifications:
Education: Bachelor’s or master’s degree in Computer Science, Information Technology, Engineering, or a related field.
Experience: Minimum of 7 to 9 years of experience in data engineering, with at least 4 years working on GCP cloud platforms. Proven experience designing and implementing data workflows using GCP services like BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer.
Certifications: Google Cloud Professional Data Engineer certification preferred.

Key Skills:
Mandatory Skills: Advanced proficiency in Python for data pipelines and automation. Strong SQL skills for querying, transforming, and analyzing large datasets. Strong hands-on experience with GCP services, including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine and Kubernetes Engine (GKE). Hands-on experience with CI/CD tools such as Jenkins, GitHub or Bitbucket. Proficiency in Docker, Kubernetes, Terraform or Ansible for containerization, orchestration, and infrastructure as code (IaC). Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer. Strong understanding of Agile/Scrum methodologies.
Nice-to-Have Skills: Experience with other cloud platforms like AWS or Azure. Knowledge of data visualization tools (e.g., Power BI, Looker, Tableau). Understanding of machine learning workflows and their integration with data pipelines.
Soft Skills: Strong problem-solving and critical-thinking abilities. Excellent communication skills to collaborate with technical and non-technical stakeholders. Proactive attitude towards innovation and learning. Ability to work independently and as part of a collaborative team.

Location: Bengaluru
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
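Orchestration with Cloud Composer, as this role describes, usually comes down to an Airflow DAG. Purely as a sketch — the DAG id, schedule, table names, and SQL are hypothetical, not the team's actual pipeline — a daily BigQuery transform step might be wired up like this:

```python
# Minimal Cloud Composer / Airflow DAG sketch (hypothetical names and SQL).
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_campaign_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # One task that rebuilds a daily rollup table inside BigQuery.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_campaign_metrics",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.marketing.daily_rollup` AS
                    SELECT campaign_id, DATE(event_ts) AS day, COUNT(*) AS impressions
                    FROM `my-project.marketing.events`
                    GROUP BY campaign_id, day
                """,
                "useLegacySql": False,
            }
        },
    )
```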

Posted 3 days ago

Apply

7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do
Demonstrate a deep understanding of cloud-native, distributed microservice-based architectures
Deliver solutions for complex business problems through a standard software SDLC
Build strong relationships with both internal and external stakeholders including product, business and sales partners
Demonstrate excellent communication skills with the ability to both simplify complex problems and also dive deeper if needed
Build and manage strong technical teams that deliver complex software solutions that scale
Manage teams with cross-functional skills that include software, quality, reliability engineers, project managers and scrum masters
Provide deep troubleshooting skills with the ability to lead and solve production and customer issues under pressure
Leverage strong experience in full-stack software development and public cloud like GCP and AWS
Mentor, coach and develop junior and senior software, quality and reliability engineers
Lead with a data/metrics-driven mindset with a relentless focus on optimizing and creating efficient solutions
Ensure compliance with EFX secure software development guidelines and best practices; responsible for meeting and maintaining QE, DevSec, and FinOps KPIs
Define, maintain and report SLAs, SLOs and SLIs meeting EFX engineering standards in partnership with the product, engineering and architecture teams
Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices
Drive up-to-date technical documentation including support, end-user documentation and runbooks
Lead Sprint planning, Sprint Retrospectives, and other team activities
Responsible for implementation architecture decision-making associated with product features/stories, refactoring work, and EOSL decisions
Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision, and present complex information in a concise format that is audience appropriate

What Experience You Need
Bachelor's degree or equivalent experience
7+ years of software engineering experience
7+ years experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
7+ years experience with cloud technology: GCP, AWS, or Azure
7+ years experience designing and developing cloud-native solutions
7+ years experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
7+ years experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart
Self-starter that identifies/responds to priority shifts with minimal supervision
Strong communication and presentation skills
Strong leadership qualities
Demonstrated problem-solving skills and the ability to resolve conflicts
Experience creating and maintaining product and software roadmaps
Experience overseeing yearly as well as product/project budgets
Working in a highly regulated environment
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
Experience with backend technologies such as Java/J2EE, Spring Boot, SOA and microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
Developing with a modern JDK (v1.7+)
Automated testing: JUnit, Selenium, LoadRunner, SoapUI

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks.

Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence.

We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
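The big data stack named above (Dataflow/Apache Beam, Pub/Sub, BigQuery) frequently starts with a Pub/Sub feed. A minimal pull-subscriber sketch with the google-cloud-pubsub client — the project and subscription names are hypothetical — looks like:

```python
# Minimal Pub/Sub pull-subscriber sketch (hypothetical project/subscription).
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "events-sub")


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # In a real pipeline this would parse and hand off to downstream storage.
    print(f"Received: {message.data!r}")
    message.ack()


streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)   # listen for 30 seconds, then stop
except TimeoutError:
    streaming_pull.cancel()
    streaming_pull.result()             # block until shutdown completes
```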

Posted 3 days ago

Apply

7.0 years

0 Lacs

India

On-site


Job Summary/Overview: We are seeking a highly experienced and skilled Senior GCP Data Engineer to design, develop, and maintain data pipelines and data warehousing solutions on the Google Cloud Platform (GCP). This role requires a strong understanding of data engineering principles and a proven track record of success in building and managing large-scale data solutions. The ideal candidate will be proficient in various GCP services and have experience working with large datasets.

Key Responsibilities:
Design, develop, and implement robust and scalable data pipelines using GCP services.
Develop and maintain data warehousing solutions on GCP.
Perform data modeling, ETL processes, and data quality assurance.
Optimize data pipeline performance and efficiency.
Collaborate with other engineers and stakeholders to define data requirements and solutions.
Troubleshoot and resolve data-related issues.
Contribute to the development and improvement of data engineering best practices.
Participate in code reviews and ensure code quality.
Document technical designs and processes.

Required Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field.
7+ years of experience as a Data Engineer.
Extensive experience with GCP services, including BigQuery, Dataflow, Dataproc, Cloud Storage, and Cloud Pub/Sub.
Proven experience designing and implementing data pipelines using ETL/ELT processes.
Experience with data warehousing concepts and best practices.
Strong SQL and data modeling skills.
Experience working with large datasets.

Preferred Qualifications:
Master's degree in Computer Science, Engineering, or a related field.
Experience with data visualization tools.
Experience with data governance and compliance.
Experience with containerization technologies (e.g., Docker, Kubernetes).
Experience with Apache Kafka or similar message queuing systems.
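The data quality assurance duty listed above is often implemented as simple assertion queries run after each load. As a sketch only — the dataset, table, and checks below are hypothetical — using the BigQuery client:

```python
# Post-load data-quality check sketch (hypothetical dataset/table names).
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# Each query returns a single BOOL that must be TRUE for the check to pass.
CHECKS = {
    "non_empty": "SELECT COUNT(*) > 0 FROM `my-project.warehouse.orders`",
    "no_null_keys": "SELECT COUNTIF(order_id IS NULL) = 0 FROM `my-project.warehouse.orders`",
    "fresh_data": """
        SELECT MAX(load_date) >= DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
        FROM `my-project.warehouse.orders`
    """,
}

failures = []
for name, sql in CHECKS.items():
    passed = list(client.query(sql).result())[0][0]
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
    if not passed:
        failures.append(name)

if failures:
    raise RuntimeError(f"Data quality checks failed: {failures}")
```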

Posted 3 days ago

Apply

8.0 years

0 Lacs

India

Remote


🚀 We’re Hiring: AEP Data Engineer
📍 Location: Remote | 💼 Experience: 6–8 Years
🕒 Contract: 6 Months (Extendable)

Prior experience with Adobe Experience Platform (AEP) is a plus! We’re looking for an experienced Data Engineer with strong GCP (Google Cloud Platform) skills and a background in ETL development and data warehouse migration.

🔧 What You’ll Do:
Design and build scalable ETL pipelines and data workflows in GCP
Migrate on-premise data warehouses to BigQuery and other GCP tools
Collaborate with architects, data scientists, and stakeholders to deliver reliable data solutions
Optimize performance, maintain data quality, and ensure smooth operations
Participate in code reviews, CI/CD workflows, and Agile ceremonies

🎯 What You Bring:
6–8 years in Data Engineering
3–4 years of hands-on experience with GCP tools: BigQuery, Dataflow, Cloud Composer, Pub/Sub
Strong SQL and Python (or a similar language)
Solid experience with ETL frameworks and data migration strategies
Proficiency in version control (Git) and working in remote agile teams
Excellent communication and ability to work independently
AEP knowledge is a big plus

📩 Apply now or share your resume at Recruiter@systembender.com
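A warehouse migration to BigQuery, as this role describes, typically lands exported files in Cloud Storage and then loads them. As a sketch under assumed names — the bucket, table, and Parquet export format are all hypothetical:

```python
# Minimal GCS-to-BigQuery load sketch (hypothetical bucket/table names).
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://legacy-dw-export/customers/*.parquet",
    "my-project.warehouse.customers",
    job_config=job_config,
)
load_job.result()  # wait for the load job to complete

table = client.get_table("my-project.warehouse.customers")
print(f"Loaded {table.num_rows} rows")
```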

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Summary
Skill Name: Power BI with GCP Developer
Experience: 7-10 yrs
Mandatory Skills: Power BI + GCP (BigQuery)

Required Skills & Qualifications:
Power BI Expertise: Strong hands-on experience in Power BI development, including report/dashboard creation, DAX, Power Query, and custom visualizations.
Semantic Model Knowledge: Proficiency in building and managing semantic models within Power BI to ensure consistency and user-friendly data exploration.
GCP Tools: Practical experience with Google Cloud Platform tools, particularly BigQuery, Dataflow, and Cloud Storage, for managing large datasets and data integration.
ETL Processes: Experience in designing and managing ETL (Extract, Transform, Load) processes using GCP services.
SQL & Data Modeling: Solid skills in SQL and data modeling, particularly for BI solutions and creating relationships between different data sources.
Cloud Data Integration: Familiarity with integrating cloud-based data sources into Power BI, including knowledge of best practices for handling cloud storage and data pipelines.
Data Analysis & Troubleshooting: Strong problem-solving abilities, including diagnosing and resolving issues in data models, reports, or data integration pipelines.
Communication & Collaboration: Excellent communication skills to work effectively with cross-functional teams.

Posted 4 days ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Role Overview
As a Principal Software Engineer, you will be a key contributor to the design, development, and deployment of advanced AI and generative AI-based products. You will drive technical innovation, lead complex projects, and collaborate closely with cross-functional teams to deliver high-quality, scalable, and maintainable solutions. This role requires a strong background in software development, AI/ML techniques, and DevOps practices, along with the ability to mentor junior engineers and contribute to strategic technical decisions.

Responsibilities
Advanced Software Development: Design, develop, and optimize high-quality code for complex software applications and systems, maintaining high standards of performance, scalability, and maintainability. Drive best practices in code quality, documentation, and test coverage.
GenAI Product Development: Lead end-to-end development of generative AI solutions, from data collection and model training to deployment and optimization. Experiment with cutting-edge generative AI techniques to enhance product capabilities and performance.
Technical Leadership: Take ownership of architecture and technical decisions for AI/ML projects. Mentor junior engineers, review code for adherence to best practices, and ensure the team follows a high standard of technical excellence.
Project Ownership: Lead execution and delivery of features, managing project scope, timelines, and priorities in collaboration with product managers. Proactively identify and mitigate risks to ensure successful, on-time project completion.
Architectural Design: Contribute to the architectural design and planning of new features, ensuring solutions are scalable, reliable, and maintainable. Engage in technical reviews with peers and stakeholders, promoting a product suite mindset.
Code Review & Best Practices: Conduct rigorous code reviews to ensure adherence to industry best practices in coding standards, maintainability, and performance optimization. Provide feedback that supports team growth and technical improvement.
Testing & Quality Assurance: Design and implement robust test suites to ensure code quality and system reliability. Advocate for test automation and the use of CI/CD pipelines to streamline testing processes and maintain service health.
Service Health & Reliability: Monitor and maintain the health of deployed services, utilizing telemetry and performance indicators to proactively address potential issues. Perform root cause analysis for incidents and drive preventive measures for improved system reliability.
DevOps Ownership: Take end-to-end responsibility for features and services, working in a DevOps model to deploy and manage software in production. Ensure efficient incident response and maintain a high level of service availability.
Documentation & Knowledge Sharing: Create and maintain thorough documentation for code, processes, and technical decisions. Contribute to knowledge sharing within the team, enabling continuous learning and improvement.

Minimum Qualifications
Educational Background: Bachelor’s degree in Computer Science, Engineering, or a related technical field; Master’s degree preferred.
Experience: 6+ years of professional software development experience, including significant experience with AI/ML or GenAI applications. Demonstrated expertise in building scalable, production-grade software solutions.
Technical Expertise: Advanced proficiency in Python, FastAPI, PyTest, Celery, and other Python frameworks. Deep knowledge of software design patterns, object-oriented programming, and concurrency.
Cloud & DevOps Proficiency: Extensive experience with cloud technologies (e.g., GCP, AWS, Azure), containerization (e.g., Docker, Kubernetes), and CI/CD practices. Strong understanding of version control systems (e.g., GitHub) and work tracking tools (e.g., JIRA).
AI/GenAI Knowledge: Familiarity with GenAI frameworks (e.g., LangChain, LangGraph), MLOps, and AI lifecycle management. Experience with model deployment and monitoring in cloud environments.

Preferred Qualifications
AI & Machine Learning: Hands-on experience with advanced ML algorithms, including generative models, NLP, and transformers. Knowledge of industry-standard AI frameworks (e.g., TensorFlow, PyTorch) and experience with data preprocessing and model evaluation.
Data & Analytics Tools: Proficiency with relational and NoSQL databases (e.g., MongoDB, MSSQL, PostgreSQL) and analytics platforms (e.g., BigQuery, Snowflake, Tableau). Experience with messaging systems (e.g., Kafka) is a plus.
Testing & Quality: Experience with test automation tools (e.g., PyTest, xUnit) and CI/CD tooling such as Terraform and GitHub Actions. Strong emphasis on building resilient and testable software.
Advanced Cloud Knowledge: Proficiency with GCP technologies such as Vertex AI, BigQuery, GKE, GCS, and Dataflow, with a focus on deploying AI models at scale.

Where we’re going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow!

UKG is proud to be an equal opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process.

Disability Accommodation
For individuals with disabilities that need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com
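Given the stack named here (Python, FastAPI, PyTest), a minimal service sketch gives a feel for the baseline expected; the endpoints and model below are hypothetical, not UKG product code:

```python
# Minimal FastAPI service sketch (hypothetical endpoints; not product code).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Prompt(BaseModel):
    text: str


@app.get("/health")
def health() -> dict:
    # Liveness endpoint for the service-health monitoring the role describes.
    return {"status": "ok"}


@app.post("/v1/summarize")
def summarize(prompt: Prompt) -> dict:
    # A real GenAI service would call a deployed model here (e.g. on Vertex AI);
    # this stub just echoes a truncated input.
    return {"summary": prompt.text[:100]}
```

Assuming the file is saved as main.py, it runs locally with `uvicorn main:app --reload`, and the endpoints are straightforward to exercise from PyTest via FastAPI's TestClient.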

Posted 4 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Company Description
Coders Brain is a global leader in IT services, digital and business solutions that partners with clients to simplify, strengthen, and transform their businesses. The company ensures high levels of certainty and satisfaction through deep industry expertise and a global network of innovation and delivery centers.

Job Title: Senior Data Engineer
Location: Hyderabad
Experience: 6+ Years
Employment Type: Full-Time

Job Summary: We are looking for a highly skilled Senior Data Engineer to join our Data Engineering team. You will play a key role in designing, implementing, and optimizing robust, scalable data solutions that drive business decisions for our clients. This position involves hands-on development of data pipelines, cloud data platforms, and analytics tools using cutting-edge technologies.

Key Responsibilities:
Design and build reliable, scalable, and high-performance data pipelines to ingest, transform, and store data from various sources.
Develop cloud-based data infrastructure using platforms such as AWS, Azure, or Google Cloud Platform (GCP).
Optimize data processing and storage frameworks for cost efficiency and performance.
Ensure high standards for data quality, integrity, and governance across all systems.
Collaborate with cross-functional teams including data scientists, analysts, and product managers to translate requirements into technical solutions.
Troubleshoot and resolve issues with data pipelines and workflows, ensuring system reliability and availability.
Stay current with emerging trends and technologies in big data and cloud ecosystems and recommend improvements accordingly.

Required Qualifications:
Bachelor’s degree in Computer Science, Software Engineering, or a related field.
Minimum 6 years of professional experience in data engineering or a related discipline.
Proficiency in Python, Java, or Scala for data engineering tasks.
Strong expertise in SQL and hands-on experience with modern data warehouses (e.g., Snowflake, Redshift, BigQuery).
In-depth knowledge of big data technologies such as Hadoop, Spark, or Hive.
Practical experience with cloud-based data platforms such as AWS (e.g., Glue, EMR), Azure (e.g., Data Factory, Synapse), or GCP (e.g., Dataflow, BigQuery).
Excellent analytical, problem-solving, and communication skills.

Nice to Have:
Experience with containerization and orchestration tools such as Docker and Kubernetes.
Familiarity with CI/CD pipelines for data workflows.
Knowledge of data governance, security, and compliance best practices.
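For the Spark experience asked for above, a minimal PySpark transformation gives a feel for the day-to-day work; the storage paths, columns, and filter are hypothetical:

```python
# Minimal PySpark aggregation sketch (hypothetical paths/columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

# Works equally against gs://, s3a://, abfss://, or hdfs:// paths.
events = spark.read.parquet("gs://my-bucket/events/")

rollup = (
    events
    .filter(F.col("event_type") == "purchase")
    .groupBy("country", F.to_date("event_ts").alias("day"))
    .agg(F.count("*").alias("purchases"), F.sum("amount").alias("revenue"))
)

rollup.write.mode("overwrite").parquet("gs://my-bucket/rollups/purchases/")
spark.stop()
```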

Posted 4 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Your potential, unleashed.
India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team
Deloitte’s Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Your work profile
As a Consultant/Senior Consultant/Manager in our Technology & Transformation practice, you’ll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. To do this, the following are the desired qualifications and required skills:
Good hands-on experience in GCP services including BigQuery, Cloud Storage, Dataflow, Cloud Dataproc, Cloud Composer/Airflow, and IAM.
Proficient experience with GCP databases: Bigtable, Spanner, Cloud SQL and AlloyDB.
Proficiency in SQL and in Python, Java, or Scala for data processing and scripting.
Experience in development and test automation processes through the CI/CD pipeline (Git, Jenkins, SonarQube, Artifactory, Docker containers).
Experience in orchestrating data processing tasks using tools like Cloud Composer or Apache Airflow.
Strong understanding of data modeling, data warehousing and big data processing concepts.
Solid understanding and experience of relational database concepts and technologies such as SQL, MySQL, PostgreSQL or Oracle.
Design and implement data migration strategies for various database types (PostgreSQL, Oracle, AlloyDB etc.).
Deep understanding of at least one database type, with the ability to write complex SQL.
Experience with NoSQL databases such as MongoDB, Scylla, Cassandra, or DynamoDB is a plus.
Optimize data pipelines for performance and cost-efficiency, adhering to GCP best practices.
Implement data quality checks, data validation, and monitoring mechanisms to ensure data accuracy and integrity.
Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
Ability to work independently and manage multiple priorities effectively.
Preferably having expertise in end-to-end DW implementation.
UG: B.Tech/B.E. in any specialization.

Location and way of working
Base location: Bengaluru/Hyderabad/Mumbai/Bhubaneshwar/Coimbatore/Delhi. This profile involves occasional travelling to client locations. Hybrid is our default way of working. Each domain has customized the hybrid approach to their unique needs.

Your role as a Consultant/Senior Consultant/Manager
We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society. In addition to living our purpose, Consultants/Senior Consultants/Managers across our organization must strive to be:
Inspiring - Leading with integrity to build inclusion and motivation
Committed to creating purpose - Creating a sense of vision and purpose
Agile - Achieving high-quality results through collaboration and team unity
Skilled at building diverse capability - Developing diverse capabilities for the future
Persuasive/Influencing - Persuading and influencing stakeholders
Collaborating - Partnering to build new solutions
Delivering value - Showing commercial acumen
Committed to expanding business - Leveraging new business opportunities
Analytical Acumen - Leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization
Effective communication - Holding well-structured and well-articulated conversations to achieve win-win possibilities
Engagement Management/Delivery Excellence - Effectively managing engagements to ensure timely and proactive execution as well as course correction for the success of engagements
Managing change - Responding to changing environments with resilience
Managing Quality & Risk - Delivering high-quality results and mitigating risks with utmost integrity and precision
Strategic Thinking & Problem Solving - Applying a strategic mindset to solve business issues and complex problems
Tech Savvy - Leveraging ethical technology practices to deliver high impact for clients and for Deloitte
Empathetic leadership and inclusivity - Creating a safe and thriving environment where everyone is valued for who they are, using empathy to understand others and adapting our behaviors and attitudes to become more inclusive

How you’ll grow
Connect for impact: Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterized by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognize there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone’s welcome… entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.

Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organization and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.

Posted 4 days ago

Apply

15.0 years

0 Lacs

Delhi, India

Remote


Educational Qualifications: BE/B.Tech/ME/M.Tech

To lead the operations of UIDAI's critical infrastructure, primarily hosted on OpenStack on-premise private cloud architecture, ensuring 24/7 availability of Aadhaar services. Manage a team of experts to design application deployment architecture to ensure high availability. Manage a team of experts to provide infra-deployment guidelines to bake into app design. Ensure robust security, scalability, and reliability of UIDAI's data centres and networks. Participate in architectural design review sessions, develop proof of concepts/pilots, implement projects, and deliver ongoing upgrades and enhancements. Revamp applications for Aadhaar's private cloud deployment in today's constantly shifting digital landscape to increase operational efficiency and reduce infrastructure costs.

Role & Responsibilities:

Innovation & Technology Transformation: Align with the vision, mission and core values of UIDAI while closely aligning with inter-disciplinary teams. Lead the Cloud Operations/Infra team in fine-tuning and optimization of cloud-native platforms to improve performance and achieve cost efficiency. Drive solution design for RFPs, POCs, and pilots for new and upcoming projects or R&D initiatives, using open-source cloud and infrastructure to build a scalable and elastic data center. Encourage and create an environment for knowledge sharing within and outside UIDAI. Interact/partner with leading institutes/R&D establishments/educational institutions to stay up to date with new technologies and trends in cloud computing. Be a thought leader in the architecture design and development of complex operational data analytics solutions to monitor various metrics related to infrastructure and applications.

Architecture Design & Deployment: Lead the design, implementation, and deployment of OpenStack-based on-premise private cloud infrastructure. Develop scalable, secure, and highly available cloud architectures to meet business and operational needs. Architect and design infrastructure solutions that support both virtualized and containerized workloads.

Solution Integration & Performance Monitoring: Integrate OpenStack with existing on-premise data centre systems, network infrastructure, and storage platforms. Work with cross-functional teams to ensure seamless integration of cloud solutions in UIDAI. Monitor cloud infrastructure performance and ensure efficient use of resources. Identify areas for improvement and implement optimizations to reduce costs and improve performance.

Security & Compliance: Implement security best practices for on-premise cloud environments, ensuring data protection and compliance with industry standards. Regularly perform security audits and vulnerability assessments to maintain a secure cloud.

Collaboration & Communication: Collaborate with internal teams (app development and security) to align cloud infrastructure with UIDAI's requirements and objectives, and manage seamless communication within tech teams and across the organization. Maintain detailed live documentation of cloud architecture, processes, and configurations to establish trails of decision-making and ensure transparency and accountability.

Role Requirements:
More than 15 years of experience in technical, infra and app solutioning, and at least 7+ years of experience spearheading large multi-disciplinary technology teams working across various domains in a leadership position. Excellent problem-solving and troubleshooting skills. Must have demonstrable experience in application performance analysis through low-level debugging.
Experience on transformation projects for on-premise data solutions and open-source CMPs (OpenStack, CloudStack).
Should be well versed with Site Reliability Engineering (SRE) concepts with a focus on extreme automation and infrastructure-as-code (IaC) methodologies, and should have led such teams before, including experience with GitOps and platform automation tools like Terraform, Ansible etc.
Strong knowledge of Linux-based operating systems (Ubuntu, CentOS, RedHat, etc.).
Strong understanding of HTTP/1.1 and HTTP/2 (with gRPC), as well as QUIC protocol functioning.
Experience in system administration, server storage, networking, virtualization, data warehouse, data integration, data migration and business intelligence/artificial intelligence solutions on the cloud.
Proficient in technology administration, remote infrastructure management, cloud assessment, QA, monitoring, and DevOps practices.
Extensive experience in cloud platform architecture, private cloud deployment, and large-scale transformation or migration of applications to cloud-native platforms.
Should have experience in building cloud-native platforms on Kubernetes, including awareness and experience of service mesh, cloud-native storage, integration with SAN & NAS, Kubernetes operators, CNI, CSI, CRI etc.
Should have strong networking expertise in terms of routing, switching, BGP, and technologies like TRILL, MP-BGP, EVPN etc. Preferably, should have experience in SAN networking and Linux networking concepts like network namespaces, route tables, and the ss utility.
Experience with cloud and on-premise databases like Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server. Exposure to any of the NoSQL databases like MongoDB, CouchDB, Cassandra, graph databases, etc.
Experience in MLOps pipelines is preferable.
Experience with distributed computing platforms and enterprise environments like Hadoop and GCP/AWS/Azure cloud is preferred.
Experience with various data integration and ETL technologies on the cloud like Spark, PySpark/Scala, and Dataflow is preferred. (ref:iimjobs.com)

Posted 4 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies