10.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Candescent is the largest non-core digital banking provider. We bring together the transformative technologies that power and connect account opening, digital banking and branch solutions for banks and credit unions of all sizes on any core. Our Candescent solutions power the top three U.S. mobile banking apps and are trusted by banks and credit unions of all sizes.

We offer an extensive portfolio of industry-leading products and services with an extensible ecosystem of out-of-the-box and integrated partner solutions. In addition, our API-first architecture and developer tools enable financial institutions to optimize and expand upon their existing capabilities by seamlessly integrating custom-built or third-party solutions. Our connected in-person, remote and digital experiences reinvent customer service across all channels. Self-service configuration and marketing tools give financial institutions greater control of their branding, targeted messaging and overall user experience, and data-driven analytics and reporting tools provide valuable insights to help drive continued growth and profitability. From conversions and implementations to custom development and customer care, our clients get expert, end-to-end support at every step.

Software Engineer IV - Java, Spring Webflux, and GCP

Job Summary: As a Software Engineer IV, you will design and develop cloud-native applications using Google Cloud Platform (GCP) services and various open-source frameworks. You will be responsible for debugging and fixing issues in Java applications with reactive programming and deployment stacks on GCP. This role requires hands-on experience with containers and a deep understanding of GCP services, as well as expertise in Spring Webflux or reactive programming.

Responsibilities:
- Cloud-Native Application Design: Design and develop cloud-native applications using GCP services and open-source frameworks.
- Debugging and Issue Resolution: Debug and fix issues in Java applications with reactive programming and deployment stacks on GCP.
- Reactive Programming: Use Spring Webflux or other reactive programming frameworks to build scalable and efficient applications.
- Container Management: Work with containers, including Docker and Kubernetes, on any cloud platform.
- GCP Services: Leverage GCP services such as Dataflow, Pub/Sub, Compute Engine, Kubernetes Engine, Filestore, Cloud SQL, and Bigtable (see the Pub/Sub sketch after this section).
- Source Control Management: Use SCM tools such as Git for version control.
- Database Management: Work with SQL (MS SQL Server, Oracle) or NoSQL databases.
- Agile Methodologies: Collaborate with teams using Agile methodologies, quickly reproducing, diagnosing, and troubleshooting complex problems.
- Lean and Agile Practices: Apply lean and agile methods of software delivery, including BDD, ATDD, and TDD.
- Infrastructure as Code: Implement Terraform automation for infrastructure as code, and use GitHub CI/CD tooling and GCP Cloud Build.
- Scrum/Agile Knowledge: Demonstrate knowledge of Scrum/Agile methodology.

Qualifications:
- Experience: 10+ years of software development experience with a focus on Java and reactive programming.
- Domain Expertise: Strong knowledge of and experience in the payments domain.
- Technical Skills: Proficiency in Java, Spring Webflux, and GCP services.
- Certifications: Relevant certifications such as Google Cloud Professional Cloud Developer or equivalent are a plus.
- Education: Bachelor's degree in Computer Science, Engineering, or a related field.
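Given the posting's emphasis on GCP messaging, here is a minimal sketch of publishing an event with the google-cloud-pubsub client; the project ID, topic ID, and message payload are placeholders, not details from the listing.

```python
from google.cloud import pubsub_v1

PROJECT_ID = "my-project"     # placeholder
TOPIC_ID = "payment-events"   # placeholder

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

# publish() takes a bytes payload plus optional string attributes and
# returns a future that resolves to the server-assigned message ID.
future = publisher.publish(topic_path, b"payment.created", source="api")
print(f"Published message {future.result()}")
```

A Dataflow or Webflux consumer on the other side of the topic would pull from a subscription; this only shows the producer half.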
Skills:
- Problem-Solving: Strong analytical and problem-solving skills to identify and resolve issues effectively.
- Communication: Excellent communication skills to collaborate with cross-functional teams and stakeholders.
- Attention to Detail: High attention to detail to ensure thorough testing and quality assurance.

Offers of employment are conditional upon passage of screening criteria applicable to the job.

EEO Statement: Integrated into our shared values is Candescent's commitment to diversity and equal employment opportunity. All qualified applicants will receive consideration for employment without regard to sex, age, race, color, creed, religion, national origin, disability, sexual orientation, gender identity, veteran status, military service, genetic information, or any other characteristic or conduct protected by law. Candescent is committed to being a globally inclusive company where all people are treated fairly, recognized for their individuality, promoted based on performance and encouraged to strive to reach their full potential. We believe in understanding and respecting differences among all people. Every individual at Candescent has an ongoing responsibility to respect and support a globally diverse environment.

Statement to Third Party Agencies: To ALL recruitment agencies: Candescent only accepts resumes from agencies on the preferred supplier list. Please do not forward resumes to our applicant tracking system, Candescent employees, or any Candescent facility. Candescent is not responsible for any fees or charges associated with unsolicited resumes.
Posted 3 weeks ago
3.0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Data Visualization Engineer

Headquartered in Dublin, Ohio, Cardinal Health, Inc. (NYSE: CAH) is a global, integrated healthcare services and products company connecting patients, providers, payers, pharmacists and manufacturers for integrated care coordination and better patient management. Backed by nearly 100 years of experience, with more than 50,000 employees in nearly 60 countries, Cardinal Health ranks among the top 20 on the Fortune 500.

Department Overview: Augmented Intelligence (augIntel) builds automation, analytics and artificial intelligence solutions that drive success for Cardinal Health by creating material savings, efficiencies and revenue growth opportunities. The team drives business innovation by leveraging emerging technologies and turning them into differentiating business capabilities.

Job Overview: Designing, building and operationalizing visualization and user interface (UI) solutions (department, business unit, and enterprise level) leveraging data sourced from Google Cloud Platform and AtScale. Expect high proficiency in visualization (e.g. Tableau, Looker, Excel) and user interface concepts and design, leveraging best-in-class tools to represent and provide access to insights in data repositories, machine learning models, and artificial intelligence applications.

Responsibilities:
- Designing and implementing data visualizations and user interfaces leveraging tools that connect to GCP cloud tools and services as well as AtScale
- Designing and building visualizations in conjunction with machine learning algorithms (process and outputs) and providing visibility into next best actions to take
- Working with both business users and developers to draft visualization requirements and provide innovative ideas that go well beyond reporting "what happened?"
- Analyzing, re-architecting and re-platforming existing visualizations that may point to local data sources or databases, and recreating them in GCP
- Re-engineering business logic embedded in existing visualizations that needs to be integrated into GCP data stores to enable sharing across other business functions
- Designing and implementing data pipelines, ingestion and curation functions on GCP cloud using GCP-native or custom programming

Desired Qualifications:
- Experience architecting and implementing data visualizations that visually present complex data and/or metric relationships to users for exploration, analysis and action
- Experience building wireframes, mock-ups, and ad-hoc visualizations/web pages that present the "art of the possible" for users to interact with and provide direction
- Actively collaborates with data engineers, modelers, data scientists and business leaders to provide design input and rapid prototypes
- Strong customer focus and interaction to listen to users, explore the data, and present multiple alternatives to meet solution objectives; works on and may lead complex activities of large data scope areas, providing solutions which may set a precedent
- 3+ years developing business-focused visualizations and user interfaces (Tableau, Excel, Looker)
- Experience connecting visualizations to data models in GCP (BigQuery, Bigtable, AtScale); see the BigQuery sketch after this listing
- Experience designing and optimizing data models on GCP cloud using GCP data stores such as BigQuery
- Hands-on experience with data ingestion technologies like GCP Dataflow, Fusion, and Airflow
- Experience writing complex SQL queries and stored procedures
- Architecting and implementing data governance and security for data platforms on GCP
- Agile development skills and experience
- Experience with CI/CD pipelines such as Concourse and Jenkins
- Tableau or Google Cloud certification is a plus

Candidates who are back-to-work, people with disabilities, without a college degree, and Veterans are encouraged to apply. Cardinal Health supports an inclusive workplace that values diversity of thought, experience and background. We celebrate the power of our differences to create better solutions for our customers by ensuring employees can be their authentic selves each day. Cardinal Health is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, ancestry, age, physical or mental disability, sex, sexual orientation, gender identity/expression, pregnancy, veteran status, marital status, creed, status with regard to public assistance, genetic status or any other status protected by federal, state or local law.
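As an illustration of the "connecting visualizations to data models in GCP" requirement above, a minimal sketch pulling a BigQuery result into a pandas DataFrame (ready for a Tableau or Looker extract) is shown below. It assumes application-default credentials and pandas are available; the project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Aggregate a hypothetical sales table into a dashboard-ready result.
sql = """
    SELECT region,
           DATE_TRUNC(order_date, MONTH) AS month,
           SUM(revenue) AS revenue
    FROM `my-project.sales.orders`
    GROUP BY region, month
    ORDER BY month
"""
df = client.query(sql).to_dataframe()
print(df.head())
```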
Posted 3 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Us: OpZen is an innovative early-stage startup founded by a team of visionary entrepreneurs with a stellar track record of building successful ventures such as Mitchell Madison, Opera Solutions, and Zenon. Our mission is to revolutionize the finance industry through the creation of groundbreaking AI-driven products and the provision of elite consulting services. We are committed to harnessing the power of advanced technology to deliver transformative solutions that drive unparalleled efficiency, foster innovation, and spur growth for our clients. Join us on our exciting journey to redefine the future of finance and leave an indelible mark on the industry.

Role: Lead/Manager

Overview: We are seeking a visionary and dynamic individual to lead our AI initiatives and data-driven strategies. This role is crucial in shaping the future of our company by leveraging advanced technologies to drive innovation and growth. The ideal candidate will possess a deep understanding of AI, machine learning, and data analytics, along with a proven track record in leadership and strategic execution.

Key Responsibilities:
- Self-Driven Initiative: Take ownership of projects and drive them to successful completion with minimal supervision, demonstrating a proactive and entrepreneurial mindset.
- Stakeholder Communication: Present insights, findings, and strategic recommendations to senior management and key stakeholders, fostering a data-driven decision-making culture.
- Executive Collaboration: Report directly to the founders and collaborate with other senior leaders to shape the company's direction and achieve our ambitious goals.
- Innovation & Problem-Solving: Foster a culture of innovative thinking and creative problem-solving to tackle complex challenges and drive continuous improvement.
- AI Research & Development: Oversee AI research and development initiatives, ensuring the integration of cutting-edge technologies and methodologies.
- Data Management: Ensure effective data collection, management, and analysis to support AI-driven decision-making and product development.

Required Skills and Qualifications:
- Bachelor's degree from a Tier 1 institution or an MBA from a recognized institution.
- Proven experience in a managerial role, preferably in a startup environment.
- Strong leadership and team management skills.
- Excellent strategic thinking and problem-solving abilities.
- Exceptional communication and interpersonal skills.
- Ability to thrive in a fast-paced, dynamic environment.
- Entrepreneurial mindset with a passion for innovation and growth.
- Extensive experience with AI technologies, machine learning, and data analytics.
- Proficiency in programming languages such as Python, R, or similar.
- Familiarity with data visualization tools like Tableau, Power BI, or similar.
- Strong understanding of data governance, privacy, and security best practices.

Technical Skills:
- Machine Learning Frameworks: Expertise in frameworks such as TensorFlow, PyTorch, or Scikit-learn (a minimal sketch follows this listing).
- Data Processing: Proficiency in tools like Apache Kafka, Apache Flink, or Apache Beam for real-time data processing.
- Database Management: Experience with SQL and NoSQL databases, including MySQL, PostgreSQL, MongoDB, or Cassandra.
- Big Data Technologies: Hands-on experience with Hadoop, Spark, Hive, or similar big data technologies.
- Cloud Computing: Strong knowledge of cloud services and infrastructure, including AWS (S3, EC2, SageMaker), Google Cloud (BigQuery, Dataflow), or Azure (Data Lake, Machine Learning).
- DevOps and MLOps: Familiarity with CI/CD pipelines, containerization (Docker, Kubernetes), and orchestration tools for deploying and managing machine learning models.
- Data Visualization: Advanced skills in data visualization tools such as Tableau, Power BI, or D3.js to create insightful and interactive dashboards.
- Natural Language Processing (NLP): Experience with NLP techniques and tools like NLTK, SpaCy, or BERT for text analysis and processing.
- Large Language Models (LLMs): Proficiency in working with LLMs such as GPT-3, GPT-4, or similar for natural language understanding and generation tasks.
- Computer Vision: Knowledge of computer vision technologies and libraries such as OpenCV, YOLO, or the TensorFlow Object Detection API.

Preferred Experience:
- Proven Track Record: Demonstrated success in scaling businesses or leading teams through significant growth phases, showcasing your ability to drive impactful results.
- AI Expertise: Deep familiarity with the latest AI tools and technologies, including Generative AI applications, with a passion for staying at the forefront of technological advancements.
- Startup Savvy: Hands-on experience in early-stage startups, with a proven ability to navigate the unique challenges and seize the opportunities that come with building a company from the ground up.
- Finance Industry Insight: Extensive experience in the finance industry, with a comprehensive understanding of its dynamics, challenges, and opportunities, enabling you to drive innovation and deliver exceptional value to clients.

Why Join Us:
- Opportunity to work closely with experienced founders and learn from their entrepreneurial journey.
- Make a significant impact on a growing company and shape its future.
- Collaborative and innovative work environment.
- Lucrative compensation package including competitive salary and equity options.
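To make the machine-learning-framework expectation concrete, a minimal scikit-learn classification pipeline is sketched below. The synthetic data stands in for a finance dataset (e.g., transaction features); nothing here comes from the listing itself.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a tabular finance dataset.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Train a baseline model and report hold-out accuracy.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(f"Accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```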
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:

Domain Skills:
- Proficient in Individual and Group Life Insurance concepts and different types of Annuity products
- Proficient in different insurance plans: Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP
- Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (systematic withdrawals, RMD, surrenders); Regulatory Changes & Taxation
- Understanding of business rules for pay-outs
- Understanding of upstream and downstream interfaces for the policy life cycle
- Experience in DXC platforms: Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus

Consulting Skills:
- Experience creating business process maps for future-state architecture, creating a WBS for the overall conversion strategy, and refining requirements in multi-vendor engagements
- Requirements gathering and elicitation: writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner
- Work with the client to define the most optimal future-state operational process and related product configuration
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams
- Work closely with the product design development team to analyse and extract functional enhancements
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle

Technology Skills:
- Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security (a reconciliation sketch follows this listing)
- Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making
- Strong understanding of data governance principles and best practices, ensuring data quality and compliance
- Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions

Industry certifications (AAPA/LOMA) will be an added advantage. Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
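For the data migration work flagged in the Technology Skills above, a hypothetical reconciliation sketch follows: it compares row counts and a column checksum between a source and target table after migration. Connection strings, the table, and the checked column are all placeholders, and it assumes SQLAlchemy 1.4+.

```python
import sqlalchemy as sa

SRC = sa.create_engine("postgresql://user:pass@src-host/policy_db")  # placeholder
TGT = sa.create_engine("postgresql://user:pass@tgt-host/policy_db")  # placeholder

def fetch_one(engine, sql: str):
    """Run a single-value query and return its scalar result."""
    with engine.connect() as conn:
        return conn.execute(sa.text(sql)).scalar_one()

# Row-count and checksum reconciliation for a migrated table.
for check in (
    "SELECT COUNT(*) FROM policies",
    "SELECT SUM(face_amount) FROM policies",
):
    src_val, tgt_val = fetch_one(SRC, check), fetch_one(TGT, check)
    status = "OK" if src_val == tgt_val else "MISMATCH"
    print(f"{status}: {check} -> source={src_val}, target={tgt_val}")
```

In practice such checks would run per table from a migration manifest, with mismatches feeding the defect-tracking process.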
Posted 3 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:

Domain Skills:
- Proficient in Individual and Group Life Insurance concepts and different types of Annuity products
- Proficient in different insurance plans: Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP
- Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (systematic withdrawals, RMD, surrenders); Regulatory Changes & Taxation
- Understanding of business rules for pay-outs
- Understanding of upstream and downstream interfaces for the policy life cycle
- Experience in DXC platforms: Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus

Consulting Skills:
- Experience creating business process maps for future-state architecture, creating a WBS for the overall conversion strategy, and refining requirements in multi-vendor engagements
- Requirements gathering and elicitation: writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner
- Work with the client to define the most optimal future-state operational process and related product configuration
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams
- Work closely with the product design development team to analyse and extract functional enhancements
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle

Technology Skills:
- Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security
- Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making
- Strong understanding of data governance principles and best practices, ensuring data quality and compliance
- Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions

Industry certifications (AAPA/LOMA) will be an added advantage. Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do
- Design, develop, and operate high-scale applications across the full engineering stack
- Design, develop, test, deploy, maintain, and improve software
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
- Participate in a tight-knit, globally distributed engineering team
- Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality
- Manage your own project priorities, deadlines, and deliverables
- Research, create, and develop software applications to extend and improve Equifax solutions
- Collaborate on scalability issues involving access to data and information
- Actively participate in Sprint planning, Sprint retrospectives, and other team activities

What Experience You Need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, and CSS
- 5+ years of experience with cloud technology: GCP, AWS, or Azure
- 5+ years of experience designing and developing cloud-native solutions
- 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What Could Set You Apart
- Self-starter who identifies and responds to priority shifts with minimal supervision
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others (a Beam sketch follows this listing)
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as Java/J2EE, Spring Boot, SOA and microservices
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence) and GitHub
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI
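As a flavor of the Dataflow/Apache Beam work named above, a minimal batch Beam sketch follows: it reads newline-delimited events, counts them per type, and writes the results. The bucket paths are placeholders, and without runner options the pipeline runs locally on the default DirectRunner.

```python
import apache_beam as beam

# Count events per type from a newline-delimited input file.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/events.txt")   # placeholder
        | "ParseType" >> beam.Map(lambda line: line.split(",")[0])
        | "Count" >> beam.combiners.Count.PerElement()
        | "Format" >> beam.MapTuple(lambda event_type, n: f"{event_type},{n}")
        | "Write" >> beam.io.WriteToText("gs://my-bucket/output/counts")  # placeholder
    )
```

Submitting the same pipeline to Dataflow would only require passing runner, project, and region pipeline options.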
Posted 3 weeks ago
4.0 - 6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Required Past Experience:
- 4 to 6 years of demonstrated relevant experience deploying, configuring, and supporting public cloud infrastructure (GCP as primary), IaaS, and PaaS
- Experience configuring and managing GCP infrastructure environment components:
  - Foundation components: Networking (VPC, VPN, Interconnect, Firewall, and Routes), IAM, folder structure, organization policy, VPC Service Controls, Security Command Center, etc.
  - Application components: BigQuery, Cloud Composer, Cloud Storage, Google Kubernetes Engine (GKE), Compute Engine, Cloud SQL, Cloud Monitoring, Dataproc, Data Fusion, Bigtable, Dataflow, etc.
  - Operational components: audit logs, Cloud Monitoring, alerts, billing exports, etc.
  - Security components: KMS, Secret Manager, etc.
- Experience with infrastructure automation using Terraform
- Experience designing and implementing CI/CD pipelines with Cloud Build, Jenkins, GitLab, Bitbucket Pipelines, etc., and source code management tools like Git
- Experience with scripting: Shell scripting and Python (a small audit-script sketch follows this listing)

Required Skills and Abilities:
- Mandatory skills: GCP Networking & IAM, Terraform, Shell/Python scripting, CI/CD pipelines
- Secondary skills: Composer, BigQuery, GKE, Dataproc, GCP Networking
- Good to have: certifications in any of the following: Cloud DevOps Engineer, Cloud Security Engineer, Cloud Network Engineer
- Good verbal and written communication skills
- Strong team player

Thanks & regards,
Prashant Awasthi
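Since the role pairs Terraform with Shell/Python scripting, here is a small hypothetical Python audit script of the kind such a role might write: it lists Cloud Storage buckets and flags any without uniform bucket-level access. It assumes application-default credentials and the google-cloud-storage client; it is illustrative, not part of the listing.

```python
from google.cloud import storage

client = storage.Client()  # uses application-default credentials

# Flag buckets that do not have uniform bucket-level access enabled.
for bucket in client.list_buckets():
    ubla = bucket.iam_configuration.uniform_bucket_level_access_enabled
    flag = "" if ubla else "  <-- uniform access disabled"
    print(f"{bucket.name}{flag}")
```

A production version would typically export findings to BigQuery or Security Command Center rather than printing them.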
Posted 3 weeks ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
WPP is the creative transformation company. We use the power of creativity to build better futures for our people, planet, clients, and communities. Working at WPP means being part of a global network of more than 100,000 talented people dedicated to doing extraordinary work for our clients. We operate in over 100 countries, with corporate headquarters in New York, London and Singapore. WPP is a world leader in marketing services, with deep AI, data and technology capabilities, global presence and unrivalled creative talent. Our clients include many of the biggest companies and advertisers in the world, including approximately 300 of the Fortune Global 500. Our people are the key to our success. We're committed to fostering a culture of creativity, belonging and continuous learning, attracting and developing the brightest talent, and providing exciting career opportunities that help our people grow.

Why we're hiring:
At WPP, technology is at the heart of everything we do, and it is WPP IT's mission to enable everyone to collaborate, create and thrive. WPP IT is undergoing a significant transformation to modernise ways of working, shift to cloud and micro-service-based architectures, drive automation, digitise colleague and client experiences and deliver insight from WPP's petabytes of data. WPP Media is the world's leading media investment company responsible for more than $63B in annual media investment through agencies Mindshare, MediaCom, Wavemaker, Essence and m/SIX, as well as the outcomes-driven programmatic audience company, Xaxis, and data and technology company Choreograph. WPP Media's portfolio includes Data & Technology, Investment and Services, all united in a vision to shape the next era of media where advertising works better for people. By leveraging all the benefits of scale, the company innovates, differentiates and generates sustained value for our clients wherever they do business. The WPP Media IT team in WPP IT are the technology solutions partner for the WPP Media group of agencies and are accountable for co-ordinating and assuring end-to-end change delivery, managing the WPP Media IT technology life cycle and innovation pipeline.

We are looking for a hands-on, technically strong Data Operations Lead to head our newly established Data Integration & Operations team in Chennai. This is a build-and-run role: you'll help define how the team operates while leading day-to-day delivery. The team is part of the global Data & Measure function and is responsible for ensuring that our data products run efficiently, reliably, and consistently across platforms and markets. You will own the operational layer of our data products, including data ingestion, monitoring, deployment pipelines, automation, and support. This role requires deep technical knowledge of Azure and/or GCP, alongside the ability to lead and scale a growing team.
What you'll be doing:

Technical Ownership & Execution
- Lead a team responsible for data integration, ingestion, orchestration, and platform operations
- Build and maintain automated data pipelines using Azure Data Factory, GCP Dataflow/Composer, or equivalent tools (a minimal Airflow sketch follows this listing)
- Define and implement platform-wide monitoring, logging, and alerting
- Manage cloud environments, including access control, security, and deployment automation

Operational Standardisation
- Create and roll out standard operating procedures, runbooks, onboarding guides, and automation patterns
- Ensure repeatable, scalable practices across all supported data products
- Define reusable deployment frameworks and templates for integration

Platform Support & Performance
- Set up and manage SLAs, incident workflows, and escalation models
- Proactively identify and resolve operational risks in cloud-based data platforms
- Partner with development and product teams to ensure a seamless transition from build to run

Team Leadership
- Lead and mentor a new, growing team in Chennai
- Shape the team's operating model, priorities, and capabilities
- Act as a subject matter expert and escalation point for technical operations

What you'll need:

Required Skills
- 7+ years in data operations, platform engineering, or data engineering
- Deep, hands-on experience in Azure and/or GCP environments
- Strong understanding of cloud-native data pipelines, architecture, and security
- Skilled in orchestration (e.g. ADF, Dataflow, Airflow), scripting (Python, Bash), and SQL
- Familiarity with DevOps practices, CI/CD, and infrastructure-as-code
- Proven experience managing production data platforms and support
- Ability to design operational frameworks from the ground up
- Demonstrated experience leading technical teams, including task prioritization, mentoring, and delivery oversight

Preferred Skills
- Experience with tools like dbt, Azure Synapse, BigQuery, Databricks, etc.
- Exposure to BI environments (e.g. Power BI, Looker)
- Familiarity with global support models and tiered ticket handling
- Experience with documentation, enablement, and internal tooling

Who you are:
- You're open: We are inclusive and collaborative; we encourage the free exchange of ideas; we respect and celebrate diverse views. We are open-minded: to new ideas, new partnerships, new ways of working.
- You're optimistic: We believe in the power of creativity, technology and talent to create brighter futures for our people, our clients and our communities. We approach all that we do with conviction: to try the new and to seek the unexpected.
- You're extraordinary: We are stronger together: through collaboration we achieve the amazing. We are creative leaders and pioneers of our industry; we provide extraordinary every day.

What we'll give you:
- Passionate, inspired people – We aim to create a culture in which people can do extraordinary work.
- Scale and opportunity – We offer the opportunity to create, influence and complete projects at a scale that is unparalleled in the industry.
- Challenging and stimulating work – Unique work and the opportunity to join a group of creative problem solvers.

Are you up for the challenge? We believe the best work happens when we're together, fostering creativity, collaboration, and connection. That's why we've adopted a hybrid approach, with teams in the office around four days a week. If you require accommodations or flexibility, please discuss this with the hiring team during the interview process.
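As a flavor of the orchestration work described above, a minimal Airflow DAG with a single Python task is sketched below. The DAG ID, schedule, and task logic are illustrative, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_batch():
    # Placeholder ingestion step (e.g., land source files into cloud storage).
    print("ingesting batch...")

with DAG(
    dag_id="daily_ingestion",          # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # assumes Airflow 2.4+
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_batch)
```

In a run-and-operate setup like this role's, the DAG would also carry retries, SLAs, and alerting callbacks wired into the team's incident workflow.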
WPP is an equal opportunity employer and considers applicants for all positions without discrimination or regard to particular characteristics. We are committed to fostering a culture of respect in which everyone feels they belong and has the same opportunities to progress in their careers. Please read our Privacy Notice (https://www.wpp.com/en/careers/wpp-privacy-policy-for-recruitment) for more information on how we process the information you provide.
Posted 3 weeks ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: GCP Data Architect
Location: Madurai, Chennai
Experience: 12+ Years
Notice Period: Immediate

About TechMango: TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary: As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
- Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
- Define data strategy, standards, and best practices for cloud data engineering and analytics
- Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery (a streaming sketch follows this listing)
- Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
- Architect data lakes, warehouses, and real-time data platforms
- Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
- Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
- Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
- Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications:
- 10+ years of experience in data architecture, data engineering, or enterprise data platforms
- Minimum 3–5 years of hands-on experience with GCP data services
- Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python/Java/SQL; data modeling (OLTP, OLAP, star/snowflake schema)
- Experience with real-time data processing, streaming architectures, and batch ETL pipelines
- Good understanding of IAM, networking, security models, and cost optimization on GCP
- Prior experience leading cloud data transformation projects
- Excellent communication and stakeholder management skills

Preferred Qualifications:
- GCP Professional Data Engineer / Architect certification
- Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
- Exposure to AI/ML use cases and MLOps on GCP
- Experience working in agile environments and client-facing roles

What We Offer:
- Opportunity to work on large-scale data modernization projects with global clients
- A fast-growing company with a strong tech and people culture
- Competitive salary, benefits, and flexibility
- A collaborative environment that values innovation and leadership
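To illustrate the streaming ingestion pattern named in the responsibilities (Pub/Sub into BigQuery via Dataflow), a pared-down Apache Beam sketch follows. The subscription and table names are placeholders, and runner/project/region options are omitted; as written it would run in streaming mode on the local DirectRunner.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # DataflowRunner options omitted here

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub")  # placeholder
        | "Parse" >> beam.Map(json.loads)
        | "WriteBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",  # placeholder table spec
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```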
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Associate Director, Software Engineering.

In this role, you will:
- Be a Lead Automation Engineer with deep hands-on experience of software automation testing and performance testing tools, practices, and processes
- Bring a deep understanding of desktop, web, and data warehouse applications, API development, design patterns, the SDLC, IaC tools, testing, and site reliability engineering, and related ways to design and develop automation frameworks
- Define and implement best practices for software automation testing, performance testing, frameworks, and patterns, including testing methodologies
- Be a generalist with breadth and depth of experience in CI/CD best practices and core experience in testing (i.e. TDD/BDD, automated testing, contract testing, API testing, desktop and web apps, DW test automation)
- See a problem or an opportunity and engineer a solution; be respected for what you deliver, not just what you say; think about the business impact of your work and take a holistic view of problem-solving
- Bring proven industry experience of running an engineering team, with a focus on optimization of processes, introduction of new technologies, solving challenges, building strategy, business planning, governance and stakeholder management
- Apply thinking to problems across multiple technical domains and suggest ways to solve them
- Contribute to architectural discussions by asking the right questions to ensure a solution matches the business needs
- Identify opportunities for system optimization, performance tuning, and scalability enhancements; implement solutions to improve system efficiency and reliability
- Use excellent verbal and written communication skills to articulate technical concepts to both technical and non-technical stakeholders
- Build performance assurance procedures with the latest feasible tools and techniques, and establish a performance test automation process to improve testing productivity
- Be responsible for the end-to-end software testing, performance testing and engineering life cycle: technical scoping, performance scripting, testing, and tuning
- Analyse test assessment results and provide recommendations to improve performance or save infrastructure costs
- Represent at Scrum meetings and all other key project meetings, and provide a single point of accountability and escalation for performance testing within the scrum teams
- Advise on needed infrastructure and performance engineering and testing guidelines, and be responsible for performance risk assessment of various application features
- Work with cross-functional teams, including software product, development, and support teams; handle tasks to accelerate testing delivery and improve quality for applications at HSBC
- Provide support in product/application design from a performance point of view
- Communicate plans, status, and results as appropriate for the target audience
- Adapt, learn innovative technologies, and be flexible to work on projects as demanded by the business
- Define and implement best practices for software automation testing, including testing standards, test reviews, coverage, testing methodologies, and traceability between requirements and test cases
- Prepare, develop and maintain a test automation framework that can be used for software and performance testing; write automation test scripts and conduct reviews
- Develop and execute regression, smoke, and integration tests in a timely manner

Requirements

To be successful in this role, you must meet the following requirements:
- Experience in software testing approaches and automation testing using Tosca, Selenium, and the Cucumber BDD framework
- Experience writing test plans, test strategies, and managing test data, including test artifact management for both automation and manual testing
- Experience setting up CI/CD pipelines and working with GitHub and Jenkins, along with integration with Cucumber and Jira
- Experience in agile methodology and proven experience working on agile projects
- Experience in analysis of bug tracking, prioritization, and bug reporting with bug-tracking tools
- Experience in SQL, Unix, Control-M, ETL, data testing, API testing, and API automation using Rest Assured (a Python analogue is sketched after this listing)
- Familiarity with the following performance testing tools: Micro Focus LoadRunner Enterprise (VuGen, Analysis, LRE OneLG); protocols: HTTP/HTML, Citrix; JMeter, Postman, Insomnia
- Familiarity with the following observability tools: AppDynamics, New Relic, Splunk, Geneos, Datadog, Grafana
- Knowledge of the following will be an added advantage: GitHub, Jenkins, Kubernetes, Jira & Confluence
- Programming and scripting language skills in Java, Shell, Scala, Groovy, and Python; WebLogic server administration
- Familiarity with the BMC Control-M tool
- CI/CD tools: Ansible, AWS RO, G3
- UNIX/Linux/web monitors and performance analysis tools to diagnose and resolve performance issues
- Experience working in an Agile environment, a "DevOps" team, or a similar multi-skilled team in a technically demanding function
- Experience in performance testing and tuning of micro-services/APIs, desktop applications, web apps, cloud services, ETL apps, and database queries
- Experience writing and modifying performance testing scripts, and implementing and using automated tools for result analysis
- Experience in performance testing and tuning of data warehouse applications doing batch processing across various stages of ETL and information delivery components

Good-to-have skills: knowledge of the latest technologies and tools, such as Python scripting, Tricentis Tosca, Dataflow, Hive, DevOps, REST APIs, Hadoop, the Kafka framework, GCP, and AWS, will be an added advantage.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSDI
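For the API-testing requirement above, a minimal pytest sketch using the requests library is shown below. The posting names Rest Assured, which is Java; this is a Python analogue under that assumption, and the endpoint and response fields are hypothetical.

```python
import requests

BASE_URL = "https://api.example.com"  # hypothetical endpoint

def test_get_account_returns_expected_fields():
    resp = requests.get(f"{BASE_URL}/accounts/123", timeout=10)
    assert resp.status_code == 200

    body = resp.json()
    # Contract-style assertion on the response shape.
    assert {"id", "status", "balance"} <= body.keys()
```

In a CI/CD pipeline of the kind described, a suite like this would run in a Jenkins stage after deployment, gating promotion to the next environment.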
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist / Consultant Specialist / Senior Software Engineer / Software Engineer (based on number of years of experience and role).

In this role, you will:

We are seeking a highly skilled and experienced Senior Data Engineer with expertise in Java (Java 8), microservices, Spring Boot 3.0.0, Postgres, JPA, UI (React, TypeScript, JavaScript), Apache Flink, Apache Beam, MongoDB, and Google Cloud Platform (GCP) services such as Dataflow, BigQuery, Pub/Sub, Google Cloud Storage (GCS), and Composer. The ideal candidate should also have hands-on experience with Apache Airflow, Google Kubernetes Engine (GKE), and Python for scripting and automation. You will play a critical role in designing, developing, and maintaining scalable, high-performance data pipelines and cloud-native solutions, with a strong focus on real-time stream processing using Apache Flink.

- Design, develop, and maintain real-time and batch data pipelines using Apache Flink and Apache Beam
- Implement stateful stream processing, event-time handling, and windowing with Flink (a small PyFlink sketch follows this listing); optimize Flink jobs for performance, scalability, and fault tolerance
- Build scalable, high-performance applications using Java; write clean, maintainable, and efficient code following best practices
- Integrate Flink pipelines with external systems such as Kafka, HDFS, and NoSQL databases
- Use Apache Airflow (or Composer on GCP) to orchestrate complex workflows and automate data pipeline execution; monitor and troubleshoot Airflow DAGs to ensure smooth operations
- Leverage GCP services to build and deploy cloud-native solutions: Dataflow (design and deploy real-time and batch data processing pipelines), BigQuery (perform data analysis and optimize queries for large datasets), Pub/Sub (implement messaging and event-driven architectures), GCS (manage and optimize cloud storage for data pipelines), and Composer (orchestrate workflows using Apache Airflow on GCP)
- Deploy and manage containerized applications on Google Kubernetes Engine (GKE); design Kubernetes manifests and Helm charts for deploying scalable and fault-tolerant applications
- Design and manage NoSQL databases using MongoDB, including schema design, indexing, and query optimization; ensure data consistency and performance for high-throughput applications
- Use Python for scripting, automation, and building utility tools; write Python scripts to interact with APIs, process data, and manage workflows
- Architect distributed systems with a focus on scalability, reliability, and performance; design fault-tolerant systems with high availability using best practices
- Work closely with cross-functional teams, including data engineers, DevOps engineers, and product managers, to deliver end-to-end solutions
- Participate in code reviews, design discussions, and technical decision-making
- Monitor production systems using tools like Stackdriver, Prometheus, or Grafana
- Optimize resource usage and costs for GCP services and Kubernetes clusters

Requirements

To be successful in this role, you should meet the following requirements:
- Strong proficiency in the Java stack mentioned above, with experience building scalable and high-performance applications
- Basic to intermediate knowledge of Python for scripting and automation
- Hands-on experience with Apache Flink for real-time stream processing and batch processing; knowledge of Flink's state management, windowing, and event-time processing; experience with Flink's integration with GCP services
- Knowledge of Apache Beam for unified batch and stream data processing
- Proficiency in Apache Airflow for building and managing workflows; experience with Composer on GCP is a plus
- Cloud platform expertise: strong experience with GCP services (Dataflow, BigQuery, Pub/Sub, GCS, and Composer); familiarity with GCP IAM, networking, and cost optimization
- Hands-on experience with Docker for containerization; proficiency in deploying and managing applications on Google Kubernetes Engine (GKE)
- Expertise in MongoDB, including schema design, indexing, and query optimization; familiarity with other NoSQL or relational databases is a plus
- Strong problem-solving and analytical skills; excellent communication and collaboration abilities; ability to work in an agile environment and adapt to changing requirements
- Experience with other stream processing frameworks like Apache Kafka Streams or Spark Streaming
- Knowledge of other cloud platforms (AWS, Azure) is a plus
- Familiarity with Helm charts for Kubernetes deployments
- Experience with monitoring tools like Prometheus, Grafana, or Stackdriver
- Knowledge of security best practices for cloud and Kubernetes environments

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India
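The role centers on Flink stream processing, which the posting frames in Java; as a compact Python analogue, a tiny PyFlink DataStream sketch of a keyed running count is below. The in-memory collection stands in for a Kafka or Pub/Sub source, and the whole thing is illustrative only.

```python
from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()

# In-memory (user_id, count) events stand in for a real streaming source.
events = env.from_collection([("u1", 1), ("u2", 1), ("u1", 1)])

(events
    .key_by(lambda e: e[0])                    # partition the stream by user
    .reduce(lambda a, b: (a[0], a[1] + b[1]))  # running count per key
    .print())

env.execute("running_count")
```

A production job would replace the collection source with a Kafka connector and add event-time windowing and checkpointed state, as the responsibilities above describe.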
Posted 3 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Good knowledge of GCP, BigQuery, SQL Server, and Postgres DB. Knowledge of Datastream, Cloud Dataflow, Terraform, ETL tools, writing procedures and functions, writing dynamic code, performance tuning and complex queries, and UNIX.
Posted 3 weeks ago
5.0 - 7.0 years
8 - 10 Lacs
Thiruvananthapuram
On-site
5 - 7 Years | 1 Opening | Trivandrum

Role description

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes:
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- # of defects post delivery
- # of non-compliance issues
- Reduction of reoccurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
- Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
- Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
- Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
- Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
- Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
- Project Management: Manage the delivery of modules effectively.
- Defect Management: Perform root cause analysis (RCA) and mitigation of defects.
- Identify defect trends and take proactive measures to improve quality.
- Estimation: Create and provide input for effort and size estimation for projects.
- Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
- Release Management: Execute and monitor the release process to ensure smooth transitions.
- Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
- Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
- Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
- Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning of data processes.
- Expertise in designing and optimizing data warehouses for cost efficiency.
- Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Capacity to clearly explain and communicate design and development aspects to customers.
- Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples:
- Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF and ADLF.
- Proficiency in SQL for analytics, including windowing functions.
- Understanding of data schemas and models relevant to various business contexts.
- Familiarity with domain-related data and its implications.
- Expertise in data warehousing optimization techniques.
- Knowledge of data security concepts and best practices.
- Familiarity with design patterns and frameworks in data engineering.

Additional Comments: UST is seeking a highly skilled and motivated Lead Data Engineer to join our Telecommunications vertical, leading impactful data engineering initiatives for US-based Telco clients. The ideal candidate will have 6–8 years of experience in designing and developing scalable data pipelines using Snowflake, Azure Data Factory, and Azure Databricks. Proficiency in Python, PySpark, and advanced SQL is essential (a PySpark sketch follows below), with a strong focus on query optimization, performance tuning, and cost-effective architecture. A solid understanding of data integration, real-time and batch processing, and metadata management is required, along with experience in building robust ETL/ELT workflows. Candidates should demonstrate a strong commitment to data quality, validation, and consistency, with working knowledge of data governance, RBAC, encryption, and compliance frameworks considered a plus. Familiarity with Power BI or similar BI tools is also advantageous, enabling effective data visualization and storytelling.
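A minimal PySpark sketch of the pipeline work described above, reading, transforming, and writing a dataset, is shown here; the paths, columns, and storage layout are placeholders rather than anything specified by the listing.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_orders").getOrCreate()

# Read raw data, derive a date column, aggregate, and write partitioned Parquet.
orders = spark.read.option("header", True).csv("s3://raw/orders.csv")  # placeholder path
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"))
)
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated/daily_revenue"  # placeholder path
)
```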
The role demands the ability to work in a dynamic, fast-paced environment, collaborating closely with stakeholders and cross-functional teams while also being capable of working independently. Strong communication skills and the ability to coordinate across multiple teams and stakeholders are critical for success. In addition to technical expertise, the candidate should bring experience in solution design and architecture planning, contributing to scalable and future-ready data platforms. A proactive mindset, eagerness to learn, and adaptability to the rapidly evolving data engineering landscape—including AI integration into data workflows—are highly valued. This is a leadership role that involves mentoring junior engineers, fostering innovation, and driving continuous improvement in data engineering practices.

Skills: Azure Databricks, Snowflake, Python, Data Engineering

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
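To make the pipeline and SQL-analytics expectations above concrete, here is a minimal PySpark sketch of the ingest, transform, join, and window pattern this role describes; the bucket paths and column names are illustrative assumptions, not client specifics:

    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.appName("telco-usage-pipeline").getOrCreate()

    # Ingest: read raw usage events and a customer dimension (paths are assumptions)
    usage = spark.read.parquet("gs://example-bucket/raw/usage_events/")
    customers = spark.read.parquet("gs://example-bucket/dims/customers/")

    # Wrangle/transform: drop malformed rows and normalise units
    clean = (usage
             .dropna(subset=["customer_id", "bytes_used"])
             .withColumn("gb_used", F.col("bytes_used") / (1024 ** 3)))

    # Join the cleaned events with the customer dimension
    joined = clean.join(customers, on="customer_id", how="left")

    # Window: rank each customer's records by usage, as in analytic SQL
    w = Window.partitionBy("customer_id").orderBy(F.desc("gb_used"))
    ranked = joined.withColumn("usage_rank", F.row_number().over(w))

    # Persist the curated layer for downstream analytics
    ranked.write.mode("overwrite").parquet("gs://example-bucket/curated/usage_ranked/")

The same windowing logic maps directly to the analytic SQL called out under Knowledge Examples, i.e. ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY gb_used DESC).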
Posted 3 weeks ago
5.0 years
0 Lacs
Haryana
Remote
About Teramind
Teramind is the leading platform for user behavior analytics, serving multiple use cases from insider risk mitigation to business process optimization. With our comprehensive suite of solutions, organizations gain unprecedented visibility into user activities while enhancing security, optimizing productivity, and ensuring compliance. Trusted by Fortune 500 companies and businesses of all sizes across industries, our innovative platform helps organizations protect sensitive data, maximize workforce performance, and create safer, more efficient digital workplaces. Through real-time monitoring and advanced analytics, we enable businesses to safeguard their most sensitive information while optimizing employee productivity in both in-office and remote work environments.

Our Core Values
At Teramind, our values drive everything we do. We embrace innovation as a fundamental principle, constantly pushing boundaries to improve our products, streamline processes, and enhance customer experiences. We foster resourcefulness by empowering our team members with the autonomy and confidence to solve problems independently while providing collaborative support when needed. As a globally inclusive organization, we celebrate diversity and create an adaptable work culture where respect and collaboration thrive across our international teams. Above all, we are committed to excellence, delivering the highest quality in every aspect of our work and consistently exceeding expectations in service to our clients and each other.

About the Role
As our AI Data Engineering Lead, you will be a pivotal figure in shaping and executing our AI/ML strategy. You will spearhead the design, development, and deployment of cutting-edge AI applications, with a strong focus on building data infrastructure to power Large Language Models (LLMs), agentic frameworks, and multi-agent systems. This role demands a blend of hands-on technical expertise, strategic thinking, and leadership to build and scale our AI capabilities on the Google Cloud Platform (GCP). You'll not only architect robust systems but also champion data engineering excellence and build and lead a team of talented AI engineers.

What You’ll Do
- Architect & Build: Design scalable, AI-first data infrastructure on GCP (Vertex AI, BigQuery, Dataflow, Pub/Sub) to power LLMs and agentic systems.
- Pipeline Mastery: Develop high-performance, real-time data pipelines to process user behavior and drive ML systems.
- End-to-End AI Systems: Lead the design, development, and deployment of AI/LLM applications using LangChain, HuggingFace, AutoGen, and more.
- Operationalize ML: Build MLOps pipelines with robust CI/CD, monitoring, testing, and model evaluation — especially for LLM outputs.
- Drive Evaluation: Create frameworks to assess safety, performance, and quality of generative AI applications.
- Code & Lead: Write production-grade Python, contribute to core infrastructure, and raise the bar for technical excellence.
- Collaborate Strategically: Work with Product, AIML leadership, and cross-functional partners to align tech execution with company vision.
- Help Us Grow the Team: Mentor and grow a team of AI engineers as we scale — help us build not just software, but a world-class engineering culture.

Requirements
- 5+ years of experience in Software, Data, and/or ML engineering
- Expert in Python, with strong knowledge of data engineering tools, GCP (Vertex AI, Dataflow, BigQuery), and AI pipelines
- Hands-on with CI/CD, model monitoring, and observability in ML systems.
Experience launching production-grade GenAI systems, especially involving LLMs, agentic workflows, or multi-agent coordination. Familiarity with AI/LLM frameworks (Langchain, LlamaIndex, HuggingFace) and modern prompt engineering techniques. Bonus Points For Master's or PhD in CS, AI, ML, or related fields Experience with AWS, Azure, or other cloud AI stacks Experience with Graph databases (e.g., Neo4j) and proficiency in SQL and NoSQL databases Background in big data (Spark, Flink) or open-source AI contributions Understanding of responsible AI, ethics, and data governance Benefits This is a remote job. Work from anywhere! We’ve been thriving as a fully-remote team since 2014. To us, remote work means flexibility and having truly diverse, global teams. Additionally: Collaboration with a forward-thinking team where new ideas come to life, experience is valued, and talent is incubated. Competitive salary Career growth opportunities Flexible paid time off Laptop reimbursement Ongoing training and development opportunities About our recruitment process We don’t expect a perfect fit for every requirement we’ve outlined. If you can see yourself contributing to the team, we want to hear your story. You can expect up to 3 interviews. In some scenarios, we’re able to streamline the process to have minimal rounds. Director-level roles and above should expect a more thorough process, with multiple rounds of interviews. All roles require reference and background checks Teramind is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration without regard to race, age, religion, color, marital status, national origin, gender, gender identity or expression, sexual orientation, disability, or veteran status.
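For a concrete taste of the real-time user-behavior pipelines described above, here is a minimal streaming-pull sketch using the google-cloud-pubsub Python client; the project ID, subscription name, and event fields are assumptions for illustration, not Teramind internals:

    import json
    from google.cloud import pubsub_v1

    PROJECT_ID = "example-project"          # assumption
    SUBSCRIPTION_ID = "user-behavior-sub"   # assumption

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        # Each message is assumed to carry one JSON-encoded user-behavior event
        event = json.loads(message.data.decode("utf-8"))
        print(f"user={event.get('user_id')} action={event.get('action')}")
        message.ack()  # acknowledge so Pub/Sub does not redeliver

    streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
    print(f"Listening on {subscription_path}...")
    try:
        streaming_pull.result()  # block until cancelled or an error occurs
    except KeyboardInterrupt:
        streaming_pull.cancel()

In a production system the callback would feed a feature store or an ML scoring service rather than printing, but the subscribe-and-ack flow is the core of the pattern.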
Posted 3 weeks ago
6.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
Position Title: Data Scientist II
Function/Group: R&D/Packaging
Location: Mumbai
Shift Timing: Regular
Role Reports to: Sr. Manager, Global Knowledge Solutions
Remote/Hybrid/In-Office: Hybrid

About General Mills
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we have been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team who upholds a vision of relentless innovation while being a force for good. For more details check out http://www.generalmills.com

General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization delivering business value, service excellence and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS) and Human Resources Shared Services (HRSS). For more details check out https://www.generalmills.co.in

We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.

Job Overview

Function Overview: In partnership with our cross-functional partners, ITQ innovates and develops products that meet the ever-changing needs of our consumers and enables long-term business growth. We identify and develop technologies that shape and protect our businesses today and into the future. ITQ operates across three organizations: Global Applications, Capabilities COEs, and Shared Services & Operations. For more details about General Mills please visit this Link

Purpose of the role: The Global Knowledge Services (GKS) organization catalyzes the creation, transfer, and application of knowledge to ensure ITQ succeeds at its mission of driving internal and external innovation, developing differentiated technology, and engendering trust through food safety and quality. The scientists in the Statistics and Analytics Program Area will collaborate with US and India GKS team members to deliver high-value statistical work that advances ITQ initiatives in consumer product research, health and nutrition science, research and development, and quality improvement. The Data Scientist II in this program area will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support advanced analytics, data science, and business intelligence across our organization, leveraging GCP services. This role requires close collaboration with statisticians, data scientists, and BI developers to ensure timely, reliable, and quality data delivery that drives insights and decision-making.

Key Accountabilities

70% of Time - Excellent Technical Work
- Design, develop, and optimize data pipelines and ETL/ELT workflows using GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.)
- Build and maintain data architecture that supports structured and unstructured data from multiple sources
- Work closely with statisticians and data scientists to provision clean, transformed datasets for advanced modeling and analytics
- Enable self-service BI through efficient data modeling and provisioning in tools like Looker, Power BI, or Tableau
- Implement data quality checks, monitoring, and documentation to ensure high data reliability and accuracy
- Collaborate with DevOps/Cloud teams to ensure data infrastructure is secure, scalable, and cost-effective
- Support and optimize workflows for data exploration, experimentation, and productization of models
- Participate in data governance efforts, including metadata management, data cataloging, and access controls

15% of Time - Client Consultation and Business Partnering
- Work effectively with clients to identify client needs and success criteria, and translate them into clear project objectives, timelines, and plans.
- Be responsive and timely in sharing project updates, responding to client queries, and delivering on project commitments.
- Clearly communicate analysis, insights, and conclusions to clients using written reports and real-time meetings.

10% of Time - Innovation, Continuous Improvement (CI), and Personal Development
- Learn and apply a CI mindset to work, seeking opportunities for improvements in efficiency and client value.
- Identify new resources, develop new methods, and seek external inspiration to drive innovations in our work processes.
- Continually build skills and knowledge in the fields of statistics and the relevant sciences.

5% of Time - Administration
- Participate in all required training (Safety, HR, Finance, CI, other) and actively take part in GKS and ITQ meetings, events, and activities.
- Complete other administrative tasks as required.

Minimum Qualifications
- Minimum Degree Requirements: Masters from an accredited university
- Minimum 6 years of related experience required

Specific Job Experience or Skills Needed
- 6+ years of experience in data engineering roles, including strong hands-on GCP experience
- Proficiency in GCP services like BigQuery, Cloud Storage, Cloud Composer (Airflow), Dataflow, Pub/Sub
- Strong SQL skills and experience working with large-scale data warehouses
- Solid programming skills in Python and/or Java/Scala
- Experience with data modeling, schema design, and performance tuning
- Familiarity with CI/CD, Git, and infrastructure-as-code principles (Terraform preferred)
- Strong communication and collaboration skills across cross-functional teams

For Global Knowledge Services
- Ability to effectively work cross-functionally with internal/global team members.
- High self-motivation, with the ability to work both independently and in teams.
- Excels at driving projects to completion, with attention to detail.
- Ability to exercise judgment in handling confidential and proprietary information.
- Ability to effectively prioritize, multi-task, and execute tasks according to a plan.
- Able to work on multiple priorities and projects simultaneously.
- Demonstrated creative problem-solving abilities, attention to detail, ability to “think outside the box.”

Preferred Qualifications
- Preferred Major Area of Study: Master’s degree in Computer Science, Engineering, Data Science, or a related field
- Preferred Professional Certifications: GCP
- Preferred 6 years of related experience

Company Overview
We exist to make food the world loves. But we do more than that.
Our company is a place that prioritizes being a force for good, a place to expand learning, explore new perspectives and reimagine new possibilities, every day. We look for people who want to bring their best — bold thinkers with big hearts who challenge one another and grow together. Because becoming the undisputed leader in food means surrounding ourselves with people who are hungry for what’s next.
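The data quality checks called out in the accountabilities above can be sketched with the google-cloud-bigquery client; the table name and the rules below are hypothetical, chosen only to show the shape of such a job:

    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials
    TABLE = "example_project.sales_mart.daily_orders"  # assumption

    # Each rule is a predicate that should match zero rows in a healthy table
    RULES = {
        "null_order_ids": f"SELECT COUNT(*) AS n FROM `{TABLE}` WHERE order_id IS NULL",
        "negative_amounts": f"SELECT COUNT(*) AS n FROM `{TABLE}` WHERE amount < 0",
        "future_dates": f"SELECT COUNT(*) AS n FROM `{TABLE}` WHERE order_date > CURRENT_DATE()",
    }

    failures = {}
    for name, sql in RULES.items():
        n = next(iter(client.query(sql).result())).n
        if n:
            failures[name] = n

    if failures:
        raise RuntimeError(f"Data quality checks failed: {failures}")
    print("All data quality checks passed.")

In practice a job like this would run on a schedule (for example from Cloud Composer) and publish failures to monitoring rather than raising, but the rule-as-query pattern is the core idea.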
Posted 3 weeks ago
2.0 - 4.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
About General Mills
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we’ve been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team who upholds a vision of relentless innovation while being a force for good. For more details check out http://www.generalmills.com

General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization delivering business value, service excellence and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS) and Human Resources Shared Services (HRSS). For more details check out https://www.generalmills.co.in. We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.

Job Overview

Function Overview
The Digital and Technology team at General Mills stands as the largest and foremost unit, dedicated to exploring the latest trends and innovations in technology while leading the adoption of cutting-edge technologies across the organization. Collaborating closely with global business teams, the focus is on understanding business models and identifying opportunities to leverage technology for increased efficiency and disruption. The team's expertise spans a wide range of areas, including AI/ML, Data Science, IoT, NLP, Cloud, Infrastructure, RPA and Automation, Digital Transformation, Cyber Security, Blockchain, SAP S4 HANA and Enterprise Architecture. The MillsWorks initiative embodies an agile@scale delivery model, where business and technology teams operate cohesively in pods with a unified mission to deliver value for the company. Employees working on significant technology projects are recognized as Digital Transformation change agents. The team places a strong emphasis on service partnerships and employee engagement with a commitment to advancing equity and supporting communities. In fostering an inclusive culture, the team values individuals passionate about learning and growing with technology, exemplified by the "Work with Heart" philosophy, emphasizing results over facetime. Those intrigued by the prospect of contributing to the digital transformation journey of a Fortune 500 company are encouraged to explore more details about the function through the following Link.

Purpose of the role
The Enterprise Data Development team is responsible for designing & architecting solutions to integrate & transform business data into the Data Warehouse to deliver a data layer for the Enterprise using cutting-edge cloud technologies like GCP.
We design solutions to meet the expanding need for more and more internal/external information to be integrated with existing sources; research, implement, and leverage new technologies to deliver more actionable insights to the enterprise. We integrate solutions that combine process, technology landscapes and business information from the core enterprise data sources that form our corporate information factory to provide end-to-end solutions for the business.

Key Accountabilities
- Create, code, and support a variety of GCP, ETL & SQL solutions
- Experience with agile techniques or methods
- Work effectively in a distributed global team environment
- Work on pipelines of moderate scope & complexity
- Effective technical & business communication with good influencing skills
- Analyse existing processes and user development requirements to ensure maximum efficiency
- Participate in the implementation and deployment of emerging tools and processes in the GCP big data space
- Turn information into insight by consulting with architects, solution managers, and analysts to understand the business needs & deliver solutions
- Support existing data warehouses & related jobs
- Proactively research up-to-date technology and techniques for development
- Bring an automation mindset, embracing a Continuous Improvement mentality to streamline & eliminate waste in all processes

Minimum Qualifications
- Demonstrates learning agility & inquisitiveness towards the latest technology
- Seeks to learn new skills via experienced team members, documented processes, and formal training
- Ability to deliver projects with minimal supervision
- Delivers assigned work within the given parameters of time and quality
- Self-motivated team player with the ability to overcome challenges and achieve desired results
- 2-4+ years of relevant experience in GCP data and data warehousing
- Excellent communication skills - verbal and written
- Excellent analytical skills
- Excellent academics
- Bachelor’s Degree in Computer Science/Electronics/Electrical from a Tier 1 institute
- Intermediate level of experience with SQL, PL/SQL, Python
- Basic level of experience with BigQuery, Composer, Dataflow, and Data Warehousing concepts

Preferred Qualifications
- GCP Data Engineering Certification or other GCP certification
- Understanding of the CPG industry

Company Overview
We exist to make food the world loves. But we do more than that. Our company is a place that prioritizes being a force for good, a place to expand learning, explore new perspectives and reimagine new possibilities, every day. We look for people who want to bring their best — bold thinkers with big hearts who challenge one another and grow together. Because becoming the undisputed leader in food means surrounding ourselves with people who are hungry for what’s next.
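Since Composer (Airflow), BigQuery, and Dataflow anchor this role, a minimal Cloud Composer DAG sketch may help set expectations; the DAG ID, schedule, and load step are assumptions, with the real load logic left as a placeholder:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def load_to_warehouse(**context):
        # Placeholder for the real extract/transform/load step,
        # e.g. a BigQuery load job or a Dataflow launch.
        print(f"Loading partition for {context['ds']}")


    with DAG(
        dag_id="enterprise_dw_daily_load",   # assumption
        start_date=datetime(2025, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load = PythonOperator(
            task_id="load_to_warehouse",
            python_callable=load_to_warehouse,
        )

Uploaded to a Composer environment's DAGs bucket, this would appear in the Airflow UI and run once per day; real pipelines would chain extract, load, and validation tasks with dependency operators.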
Posted 3 weeks ago
12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Looking for Manager Data Engineering:
- 12+ years total experience in data engineering and analytics.
- 2 years of experience in GCP Cloud services such as BigQuery, Airflow DAGs, Dataflow, etc.
- 2 years of experience in data extraction and creating data pipeline workflows on Big Data (Hive, HQL/PySpark), with knowledge of data engineering concepts.
- Exposure to analyzing large data sets from multiple data sources and performing data validation.
- Knowledge of Hadoop ecosystem components like HDFS, Spark, Hive, Sqoop.
- Experience writing code in Python.
- Knowledge of SQL/HQL functionality to write optimized queries.
- Ability to build a migration plan in collaboration with stakeholders.
- Analytical and problem-solving skills.
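A minimal sketch of the Hive/HQL-plus-PySpark extraction work this role describes, assuming a Hive-enabled Spark session and a hypothetical database and table:

    from pyspark.sql import SparkSession

    # Hive support lets Spark read tables registered in the Hive metastore
    spark = (SparkSession.builder
             .appName("hive-extraction")
             .enableHiveSupport()
             .getOrCreate())

    # HQL against a hypothetical Hive table; the partition filter limits the scan
    df = spark.sql("""
        SELECT customer_id, SUM(amount) AS total_amount
        FROM sales_db.transactions        -- assumption
        WHERE dt >= '2025-01-01'
        GROUP BY customer_id
    """)

    # Validate row counts before handing data downstream
    print(f"Extracted {df.count()} aggregated rows")
    df.write.mode("overwrite").parquet("/data/curated/customer_totals")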
Posted 3 weeks ago
12.0 - 17.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What You Will Do
Let’s do this. Let’s change the world. In this vital role you will play a key role in successfully leading the engagement model between Amgen's Technology organization and Global Commercial Operations.
- Collaborate with G&A (Finance, HR, Legal, IT, etc.) Business SMEs, Data Engineers, Data Scientists and Product Managers to lead business analysis activities, ensuring alignment with engineering and product goals on the Data & AI Product Teams
- Become a G&A (Finance, HR, Legal, IT, etc.) domain authority in Data & AI technology capabilities by researching, deploying, and sustaining features built according to Amgen’s Quality System
- Lead the voice-of-the-customer assessment to define business processes and product needs
- Work with Product Managers and customers to define scope and value for new developments
- Collaborate with Engineering and Product Management to prioritize release scopes and refine the Product backlog
- Ensure non-functional requirements are included and prioritized in the Product and Release Backlogs
- Facilitate the breakdown of Epics into Features and sprint-sized User Stories and participate in backlog reviews with the development team
- Clearly express features in User Stories/requirements so all team members and partners understand how they fit into the product backlog
- Ensure Acceptance Criteria and Definition of Done are well-defined
- Work closely with Business SMEs, Data Scientists, and ML Engineers to understand the requirements around data products, KPIs, etc.
- Analyze the source systems and create the source-to-target mapping (STTM) documents
- Develop and implement effective product demonstrations for internal and external partners
- Maintain accurate documentation of configurations, processes, and changes
- Understand end-to-end data pipeline design and dataflow
- Apply knowledge of data structures to diagnose data issues for resolution by the data engineering team

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients. We are seeking a highly skilled and experienced Principal IS Business Analyst with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders with these qualifications.

Basic Qualifications:
- 12 to 17 years of experience in G&A (Finance, HR, Legal, IT, etc.) Information Systems
- Mandatory work experience as a business analyst in DWH, data product building, and BI & Analytics applications.
- Experience in analyzing the requirements of BI, AI & Analytics applications and working with Data Source SMEs and Data Owners to identify the data sources and data flows
- Experience with writing user requirements and acceptance criteria
- Affinity to work in a DevOps environment and an Agile mindset
- Ability to work in a team environment, effectively interacting with others
- Ability to meet deadlines and schedules and be accountable

Preferred Qualifications:

Must-Have Skills
- Excellent problem-solving skills and a passion for solving complex challenges for AI-driven technologies
- Experience with Agile software development methodologies (Scrum)
- Superb communication skills and the ability to work with senior leadership with confidence and clarity
- Experience with writing user requirements and acceptance criteria in agile project management systems such as JIRA
- Experience in managing product features for PI planning and developing product roadmaps and user journeys

Good-to-Have Skills:
- Demonstrated expertise in data and analytics and related technology concepts
- Understanding of data and analytics software systems strategy, governance, and infrastructure
- Familiarity with low-code/no-code test automation software
- Technical thought leadership
- Able to communicate technical or complex subject matter in business terms
- Jira Align experience
- Experience with DevOps, Continuous Integration, and Continuous Delivery methodology

Soft Skills:
- Able to work under minimal supervision
- Excellent analytical and gap/fit assessment skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

Technical Skills:
- Experience with cloud-based data technologies (e.g., Databricks, Redshift, S3 buckets) on AWS or similar cloud-based platforms
- Experience with design patterns, data structures, test-driven development
- Knowledge of NLP techniques for text analysis and sentiment analysis

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 3 weeks ago
12.0 - 17.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What You Will Do
Let’s do this. Let’s change the world. In this vital role you will play a key role in successfully leading the engagement model between Amgen's Technology organization and Global Commercial Operations.
- Collaborate with pharma / bio-technology operations (supply chain, manufacturing, quality) Business SMEs, Data Engineers, Data Scientists and Product Managers to lead business analysis activities, ensuring alignment with engineering and product goals on the Data & AI Product Teams
- Become a pharma / bio-technology operations (supply chain, manufacturing, quality) domain authority in Data & AI technology capabilities by researching, deploying, and sustaining features built according to Amgen’s Quality System
- Lead the voice-of-the-customer assessment to define business processes and product needs
- Work with Product Managers and customers to define scope and value for new developments
- Collaborate with Engineering and Product Management to prioritize release scopes and refine the Product backlog
- Ensure non-functional requirements are included and prioritized in the Product and Release Backlogs
- Facilitate the breakdown of Epics into Features and sprint-sized User Stories and participate in backlog reviews with the development team
- Clearly express features in User Stories/requirements so all team members and partners understand how they fit into the product backlog
- Ensure Acceptance Criteria and Definition of Done are well-defined
- Work closely with Business SMEs, Data Scientists, and ML Engineers to understand the requirements around data products, KPIs, etc.
- Analyze the source systems and create the source-to-target mapping (STTM) documents
- Develop and implement effective product demonstrations for internal and external partners
- Maintain accurate documentation of configurations, processes, and changes
- Understand end-to-end data pipeline design and dataflow
- Apply knowledge of data structures to diagnose data issues for resolution by the data engineering team

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients. We are seeking a highly skilled and experienced Principal IS Business Analyst with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders with these qualifications.

Basic Qualifications:
- 12 to 17 years of experience in pharma / bio-technology operations (supply chain, manufacturing, quality) Information Systems
- Mandatory work experience as a business analyst in DWH, data product building, and BI & Analytics applications.
- Experience in analyzing the requirements of BI, AI & Analytics applications and working with Data Source SMEs and Data Owners to identify the data sources and data flows
- Experience with writing user requirements and acceptance criteria
- Affinity to work in a DevOps environment and an Agile mindset
- Ability to work in a team environment, effectively interacting with others
- Ability to meet deadlines and schedules and be accountable

Preferred Qualifications:

Must-Have Skills
- Excellent problem-solving skills and a passion for solving complex challenges for AI-driven technologies
- Experience with Agile software development methodologies (Scrum)
- Superb communication skills and the ability to work with senior leadership with confidence and clarity
- Experience with writing user requirements and acceptance criteria in agile project management systems such as JIRA
- Experience in managing product features for PI planning and developing product roadmaps and user journeys

Good-to-Have Skills:
- Demonstrated expertise in data and analytics and related technology concepts
- Understanding of data and analytics software systems strategy, governance, and infrastructure
- Familiarity with low-code/no-code test automation software
- Technical thought leadership
- Able to communicate technical or complex subject matter in business terms
- Jira Align experience
- Experience with DevOps, Continuous Integration, and Continuous Delivery methodology

Soft Skills:
- Able to work under minimal supervision
- Excellent analytical and gap/fit assessment skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

Technical Skills:
- Experience with cloud-based data technologies (e.g., Databricks, Redshift, S3 buckets) on AWS or similar cloud-based platforms
- Experience with design patterns, data structures, test-driven development
- Knowledge of NLP techniques for text analysis and sentiment analysis

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 3 weeks ago
0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:
- Proficient in Individual and Group Life Insurance concepts and different types of Annuity products.
- Proficient in different insurance plans - Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP.
- Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (Systematic Withdrawals, RMD, Surrenders); Regulatory Changes & Taxation.
- Understanding of the business rules for pay-out.
- Understanding of upstream and downstream interfaces for the policy lifecycle.
- Experience in DXC Platforms – Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus.

Consulting Skills:
- Experience in creating business process maps for future-state architecture, creating the WBS for an overall conversion strategy, and the requirement refinement process in multi-vendor engagements.
- Requirements gathering and elicitation – writing BRDs and FSDs; conducting JAD sessions and workshops to capture requirements and working closely with the Product Owner.
- Work with the client to define the most optimal future-state operational process and related product configuration.
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value.
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
- Work closely with the product design development team to analyse and extract functional enhancements.
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Technology Skills:
- Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security.
- Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making.
- Strong understanding of data governance principles and best practices, ensuring data quality and compliance.
- Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions.
- Industry certifications (AAPA/LOMA) will be an added advantage.
- Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
8.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About the Role:
We are seeking a highly skilled Senior Data Engineer with extensive experience in Google Cloud to join our dynamic team. In this role, you will be responsible for designing, implementing, and optimizing data pipelines and architectures to support our data-driven decision-making processes. You will collaborate with cross-functional teams to ensure the seamless integration and utilization of data across the organization.

Key Responsibilities
- Design and develop scalable data pipelines and architectures using Google Cloud Platform (GCP) services.
- Create and maintain optimum data pipeline architecture for ingestion, storage, processing & transformation of data for building data products for analytics.
- Implement data integration solutions to consolidate data from various sources.
- Build the optimal data extraction & transformation mechanisms for various kinds of data.
- Optimize data storage and retrieval processes for performance and cost-efficiency.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Ensure data quality, integrity, and security across all data processes.
- Conduct proactive performance monitoring and fine-tuning for early detection and rectification of slow responses and application errors, and implement corrective and preventive actions.
- Monitor and troubleshoot data pipeline issues to ensure continuous data flow.
- Build data pipelines that are scalable, resilient and sustainable to address business requirements.
- Stay updated with the latest trends and best practices in data engineering and Google Cloud technologies.

Be Part Of An Extraordinary Story
Your skills. Your imagination. Your ambition. Here, there are no boundaries to your potential and the impact you can make. You’ll find infinite opportunities to grow and work on the biggest, most rewarding challenges that will build your skills and experience. You have the chance to be a part of our future, and build the life you want while being part of an international community. Our best is here and still to come. To us, impossible is only a challenge. Join us as we dare to achieve what’s never been done before. Together, everything is possible.

Job Posting Jul 10, 2025, 1:31:05 PM

Required Skills
- Proven experience as a Senior Data Engineer, with a focus on Google Cloud Platform (GCP).
- Strong proficiency in GCP services such as BigQuery, Dataproc, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
- Expertise in SQL and database management.
- Experience with ETL processes and tools.
- Proficiency in the Python programming language.
- Knowledge of data modeling, data warehousing, and data lakes.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Preferred Skills
- Experience with machine learning and AI frameworks.
- Knowledge of containerization and orchestration tools.
- Experience with data visualization tools (Tableau, PowerBI, Looker).
- Understanding of data governance and compliance standards.
- Knowledge of the airline domain.
- Knowledge of agile/lean development methodologies.

About You
The applicant should have a bachelor’s degree or equivalent (degree in engineering, computer applications, commerce, or business administration). You must have a minimum of 8 years of data engineering experience, should have excellent verbal and written communication skills, and possess good analytical and interpersonal skills as a proven team player.
About Qatar Airways Group Our story started with four aircraft. Today, we deliver excellence across 12 different businesses coming together as one. We’ve grown fast, broken records and set trends that others follow. We don’t slow down by the fear of failure. Instead, we dare to achieve what’s never been done before. So whether you’re creating a unique experience for our customers or innovating behind the scenes, every person contributes to our proud story. A story of spectacular growth and determination. Now is the time to bring your best ideas and passion to a place where your ambition will know no boundaries, and be part of a truly global community. How To Apply If you’re ready to join a progressive team and have a challenging and rewarding career, then apply now by uploading your CV and completing our quick application form.
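As an illustration of the scalable GCP pipeline work this posting describes, here is a minimal Apache Beam pipeline that runs locally by default and on Dataflow with the appropriate runner options; the bucket paths and parsing logic are assumptions for illustration:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # DirectRunner by default; pass --runner=DataflowRunner (plus project,
    # region, and temp_location) to execute the same code on Cloud Dataflow.
    options = PipelineOptions()

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRaw" >> beam.io.ReadFromText("gs://example-bucket/raw/events-*.csv")
            | "ParseCsv" >> beam.Map(lambda line: line.split(","))
            | "KeyByUser" >> beam.Map(lambda cols: (cols[0], 1))
            | "CountPerUser" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda user, n: f"{user},{n}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/curated/event_counts")
        )

The appeal of this model for the role's scalability requirement is that the identical pipeline code scales from a laptop test to a distributed Dataflow job purely through runner configuration.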
Posted 3 weeks ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

Role & Responsibilities
- Utilize Google Cloud Platform & Data Services to modernize legacy applications.
- Understand technical business requirements and define architecture solutions that align to Ford Motor & Credit Companies Patterns and Standards.
- Collaborate and work with global architecture teams to define the analytics cloud platform strategy and build cloud analytics solutions within the enterprise data factory.
- Provide architecture leadership in the design & delivery of the new Unified Data Platform on GCP.
- Understand complex data structures in the analytics space as well as interfacing application systems.
- Develop and maintain conceptual, logical & physical data models.
- Design and guide Product teams on Subject Areas and Data Marts to deliver integrated data solutions.
- Leverage cloud AI/ML platforms to deliver business and technical requirements.
- Provide architectural guidance for optimal solutions considering regional regulatory needs.
- Provide architecture assessments on technical solutions and make recommendations that meet business needs and align with architectural governance and standards.
- Guide teams through the enterprise architecture processes and advise teams on cloud-based design, development, and data mesh architecture.
- Provide advisory and technical consulting across all initiatives including PoCs, product evaluations and recommendations, security, architecture assessments, integration considerations, etc.

Required Skills and Selection Criteria:
- Google Professional Solution Architect certification.
- 8+ years of relevant work experience in analytics application and data architecture, with a deep understanding of cloud hosting concepts and implementations.
- 5+ years’ experience in Data and Solution Architecture in the analytics space.
- Solid knowledge of cloud data architecture and data modelling principles, and expertise in data modeling tools.
- Experience in migrating legacy analytics applications to cloud platforms and driving business adoption of these platforms to build insights and dashboards, through deep knowledge of traditional and cloud Data Lake, Warehouse and Mart concepts.
- Good understanding of domain-driven design and data mesh principles.
- Experience with designing, building, and deploying ML models to solve business challenges using Python/BQML/Vertex AI on GCP.
- Knowledge of enterprise frameworks and technologies.
- Strong in architecture design patterns, with experience in secure interoperability standards and methods, architecture tools and processes.
- Deep understanding of traditional and cloud data warehouse environments, with hands-on programming experience building data pipelines on cloud in a highly distributed and fault-tolerant manner.
- Experience using Dataflow, Pub/Sub, Kafka, Cloud Run, Cloud Functions, BigQuery, Dataform, Dataplex, etc.
- Strong understanding of DevOps principles and practices, including continuous integration and deployment (CI/CD) and automated testing & deployment pipelines.
- Good understanding of cloud security best practices and familiarity with different security tools and techniques like Identity and Access Management (IAM), Encryption, and Network Security.
- Strong understanding of microservices architecture.

Qualifications

Nice to Have
- Bachelor’s degree in Computer Science/Engineering, Data Science or a related field.
- Strong leadership, communication, interpersonal, organizing, and problem-solving skills.
- Good presentation skills with the ability to communicate architectural proposals to diverse audiences (user groups, stakeholders, and senior management).
- Experience in the Banking and Financial Regulatory Reporting space.
- Ability to work on multiple projects in a fast-paced, dynamic environment.
- Exposure to multiple, diverse technologies, platforms, and processing environments.
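The Python/BQML/Vertex AI requirement above can be illustrated with a short sketch that trains and evaluates a BigQuery ML model from Python; the project, dataset, table, and label column are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Train a logistic-regression model directly in BigQuery (BQML).
    # Dataset, table, and columns below are assumptions for illustration.
    train_sql = """
    CREATE OR REPLACE MODEL `example_project.analytics.churn_model`
    OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
    SELECT tenure_months, monthly_spend, support_tickets, churned
    FROM `example_project.analytics.customer_features`
    """
    client.query(train_sql).result()  # blocks until training completes

    # Evaluate the trained model with BQML's built-in function
    eval_sql = "SELECT * FROM ML.EVALUATE(MODEL `example_project.analytics.churn_model`)"
    for row in client.query(eval_sql).result():
        print(dict(row))

Training where the data already lives avoids the export step entirely, which is one reason BQML shows up alongside Vertex AI in architecture discussions like the one this role leads.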
Posted 3 weeks ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Responsibilities
- Java 11 Spring Boot development and support; skilled in Java 11 and above.
- Java developer with AMQ/MQTT/OpenShift; skilled in REST API-based web application development on Red Hat.
- Additional skills required include microservices on OpenShift, GKE, and Cloud Endpoints.
- Skilled in using queues (AMQ, MQTT).
- Support the application's BL, DL, integration, and services layers using Java.
- Develop all CRUD dataflows and business logic.
- Provide deployment support and documentation.
- Should possess overall knowledge of the application and its functionality.
- Foster open communication within and between teams.
- Support minor design work and fixes of the applications, working with the front-end and back-end teams.
- Provide technical guidance to the team and lead on issue resolution.

Qualifications:
- 6-9 years of experience in web applications using Java.
- Experience in building applications using Java 11 and above on Red Hat.
- Experience using AMQ, MQTT, and OpenShift (preferably two of these three).
- Experience in integrating front end, back end, and services.

Good to Have
- Knowledge of Drools and SQL Server.
- Strong analytical and business-logic design capabilities.
- Strong team player.

Qualifications
- Familiarity with web/mobile application support using the Java 11 stack and above.
- Degree in computer science or an appropriate related field preferred.
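Although this is a Java role, the queue-based publish-and-consume flow it supports can be sketched compactly with the paho-mqtt Python client; the broker host, port, and topic are assumptions for illustration:

    import json
    import paho.mqtt.client as mqtt

    BROKER_HOST = "broker.example.com"  # assumption
    TOPIC = "orders/created"            # assumption

    def on_message(client, userdata, msg):
        # Mirrors the consumer side of the service's queue integration
        payload = json.loads(msg.payload.decode("utf-8"))
        print(f"received on {msg.topic}: {payload}")

    client = mqtt.Client()  # paho-mqtt 1.x style constructor
    client.on_message = on_message
    client.connect(BROKER_HOST, 1883)
    client.subscribe(TOPIC)

    # Publish one event, then process incoming messages
    client.publish(TOPIC, json.dumps({"order_id": 42, "status": "NEW"}))
    client.loop_forever()

The same publish/subscribe shape maps onto the Spring ecosystem the role actually uses (for example spring-integration-mqtt for MQTT, or JMS listeners for AMQ).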
Posted 3 weeks ago
5.0 years
5 - 7 Lacs
Thiruvananthapuram
On-site
5 - 7 Years | 1 Opening | Trivandrum

Role description

Job Summary: We are seeking a Senior Data Engineer with strong hands-on experience in PySpark, Big Data technologies, and cloud platforms (preferably GCP). The ideal candidate will design, implement, and optimize scalable data pipelines while driving technical excellence and process improvements. You will collaborate with cross-functional teams to solve complex data problems and ensure delivery of high-quality, cost-effective data solutions.

Roles & Responsibilities:

Design & Development:
- Develop scalable and efficient data pipelines using PySpark, Hive, SQL, Spark, and Hadoop.
- Translate high-level business requirements and design documents (HLD/LLD) into technical specifications and implementation.
- Create and maintain architecture and design documentation.

Performance Optimization & Quality:
- Monitor, troubleshoot, and optimize data workflows for cost, performance, and reliability.
- Perform root cause analysis (RCA) on defects and implement mitigation strategies.
- Ensure adherence to coding standards, version control practices, and testing protocols.

Collaboration & Stakeholder Engagement:
- Interface with product managers, data stewards, and customers to clarify requirements.
- Conduct technical presentations, design walkthroughs, and product demos.
- Provide timely updates, escalations, and support during UAT and production rollouts.

Project & People Management:
- Manage delivery of data modules/user stories with a focus on timelines and quality.
- Set and review FAST goals for self and team; provide mentorship and technical guidance.
- Maintain team engagement and manage team member aspirations through regular feedback and career support.

Compliance & Knowledge Management:
- Ensure compliance with mandatory trainings and engineering processes.
- Contribute to and consume project documentation, templates, checklists, and domain-specific knowledge.
- Review and approve reusable assets developed by the team.

Must-Have Skills:
- 6+ years of experience in Data Engineering or related roles.
- Strong proficiency in PySpark, SQL, Spark, Hive, and the Hadoop ecosystem.
- Hands-on experience with Google Cloud Platform (GCP) or equivalent cloud services (e.g., AWS, Azure).
- Expertise in designing, building, testing, and deploying large-scale data processing systems.
- Sound understanding of data architecture, ETL frameworks, and batch/streaming data pipelines.
- Strong knowledge of Agile methodologies (Scrum/Kanban).
- Experience with code reviews, version control (Git), and CI/CD tools.
- Excellent communication skills, both verbal and written.

Good-to-Have Skills:
- GCP Professional Data Engineer Certification or equivalent.
- Experience with Airflow, Dataflow, BigQuery, or similar GCP-native tools.
- Knowledge of data modeling techniques and data governance.
- Exposure to domain-specific projects (e.g., BFSI, Healthcare, Retail).
- Experience with Docker, Kubernetes, or other containerization tools.
- Working knowledge of test automation and performance testing frameworks.

Skills: Spark, Hadoop, Hive, GCP

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations.
With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
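The cost and performance tuning emphasis in the Senior Data Engineer posting above can be illustrated with a small PySpark sketch showing two common levers, partition-pruned reads and broadcast joins; the paths and columns are assumptions:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("pipeline-tuning").getOrCreate()

    # Partition pruning: filtering on the partition column limits the files scanned
    events = (spark.read.parquet("gs://example-bucket/events/")  # assumption
              .filter(F.col("dt") == "2025-07-01"))

    small_dim = spark.read.parquet("gs://example-bucket/dims/products/")

    # Broadcast join: ship the small dimension table to every executor and
    # avoid a full shuffle of the large fact table
    enriched = events.join(F.broadcast(small_dim), on="product_id", how="left")

    # Write with a sensible file count instead of thousands of tiny files
    enriched.coalesce(32).write.mode("overwrite").parquet(
        "gs://example-bucket/curated/enriched_events/")

Both levers reduce bytes shuffled and files touched, which is usually where the cost and runtime of Spark jobs on cloud storage actually go.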
Posted 3 weeks ago
8.0 years
3 - 8 Lacs
Ahmedabad
On-site
Job title: Senior Data Engineer (Google Cloud)
Ref #: 250000EW
Location: India - Ahmedabad
Job family: Corporate & Commercial
Closing date: 24-Jul-2025

About the Role:
We are seeking a highly skilled Senior Data Engineer with extensive experience in Google Cloud to join our dynamic team. In this role, you will be responsible for designing, implementing, and optimizing data pipelines and architectures to support our data-driven decision-making processes. You will collaborate with cross-functional teams to ensure the seamless integration and utilization of data across the organization.

Key Responsibilities:
- Design and develop scalable data pipelines and architectures using Google Cloud Platform (GCP) services.
- Create and maintain optimum data pipeline architecture for ingestion, storage, processing & transformation of data for building data products for analytics.
- Implement data integration solutions to consolidate data from various sources.
- Build the optimal data extraction & transformation mechanisms for various kinds of data.
- Optimize data storage and retrieval processes for performance and cost-efficiency.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Ensure data quality, integrity, and security across all data processes.
- Conduct proactive performance monitoring and fine-tuning for early detection and rectification of slow responses and application errors, and implement corrective and preventive actions.
- Monitor and troubleshoot data pipeline issues to ensure continuous data flow.
- Build data pipelines that are scalable, resilient and sustainable to address business requirements.
- Stay updated with the latest trends and best practices in data engineering and Google Cloud technologies.

Be part of an extraordinary story:
Your skills. Your imagination. Your ambition. Here, there are no boundaries to your potential and the impact you can make. You’ll find infinite opportunities to grow and work on the biggest, most rewarding challenges that will build your skills and experience. You have the chance to be a part of our future, and build the life you want while being part of an international community. Our best is here and still to come. To us, impossible is only a challenge. Join us as we dare to achieve what’s never been done before. Together, everything is possible.

Qualifications

Required Skills:
- Proven experience as a Senior Data Engineer, with a focus on Google Cloud Platform (GCP).
- Strong proficiency in GCP services such as BigQuery, Dataproc, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
- Expertise in SQL and database management.
- Experience with ETL processes and tools.
- Proficiency in the Python programming language.
- Knowledge of data modeling, data warehousing, and data lakes.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Preferred Skills:
- Experience with machine learning and AI frameworks.
- Knowledge of containerization and orchestration tools.
- Experience with data visualization tools (Tableau, PowerBI, Looker).
- Understanding of data governance and compliance standards.
- Knowledge of the airline domain.
- Knowledge of agile/lean development methodologies.

About you:
The applicant should have a bachelor’s degree or equivalent (degree in engineering, computer applications, commerce, or business administration). You must have a minimum of 8 years of data engineering experience.
Should have excellent verbal and written communication skills, and possess good analytical and interpersonal skills as a proven team player.

About Qatar Airways Group:
Our story started with four aircraft. Today, we deliver excellence across 12 different businesses coming together as one. We’ve grown fast, broken records and set trends that others follow. We don’t slow down by the fear of failure. Instead, we dare to achieve what’s never been done before. So whether you’re creating a unique experience for our customers or innovating behind the scenes, every person contributes to our proud story. A story of spectacular growth and determination. Now is the time to bring your best ideas and passion to a place where your ambition will know no boundaries, and be part of a truly global community.

How to apply:
If you’re ready to join a progressive team and have a challenging and rewarding career, then apply now by uploading your CV and completing our quick application form.

https://aa115.taleo.net/careersection/QA_External_CS/jobapply.ftl?lang=en&job=250000EW
Posted 3 weeks ago