0.0 years
0 Lacs
Hyderabad, Telangana
On-site
About the Role
Grade Level (for internal use): 11

S&P Global Market Intelligence

The Role: Lead Software Engineer

The Team: The Market Intelligence Industry Data Solutions business line provides data, technology, and services supporting acquisition, ingestion, content management, mastering, and distribution to power our Financial Institution Group (FIG) business and customer needs. The team focuses on platform scale to support the business, following a common data lifecycle that accelerates business value. We provide essential intelligence for the Financial Services, Real Estate, and Insurance industries.

The Impact: The FIG Data Engineering team is responsible for implementing and maintaining the services and tools that support existing feed systems, allowing users to consume FIG datasets and making FIG data available to a data fabric for wider consumption and processing within the company.

What's in it for you: The opportunity to work with global stakeholders on the latest tools and technologies.

Responsibilities:
- Build new data acquisition and transformation pipelines using big data and cloud technologies (a minimal pipeline sketch follows this posting).
- Work with the broader technology team, including the information architecture and data fabric teams, to align pipelines with the lodestone initiative.

What We're Looking For:
- Bachelor's degree in computer science or equivalent, with at least 8 years of professional software work experience.
- Experience with big data platforms such as Apache Spark, Apache Airflow, Google Cloud Platform, and Apache Hadoop.
- Deep understanding of REST, good API design, and OOP principles.
- Experience with object-oriented/object-function scripting languages: Python, C#, Scala, etc.
- Good working knowledge of relational SQL and NoSQL databases.
- Experience maintaining and developing production software using cloud-based tooling (AWS, Docker and Kubernetes, Okta).
- Strong collaboration and teamwork skills with excellent written and verbal communication skills.
- Self-starter, motivated, and able to work in a fast-paced software development environment.
- Agile experience highly desirable.
- Experience with Snowflake and Databricks is a big plus.

Return to Work: Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative (link to career site page when available), we encourage enthusiastic and talented returners to apply and will actively support your return to the workplace.

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology – the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people; that's why we provide everything you (and your career) need to thrive at S&P Global. Our benefits include:
- Health & Wellness: health care coverage designed for the mind and body.
- Flexible Downtime: generous time off helps keep you energized for your time on.
- Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: it's not just about you; S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer, and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The "EEO is the Law" poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings (Strategic Workforce Planning)

Job ID: 316183
Posted On: 2025-06-15
Location: Hyderabad, Telangana, India
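For a concrete flavor of the pipeline work this role describes, here is a minimal, hedged PySpark sketch. The bucket paths, column names, and schema are invented for illustration and are not S&P Global specifics.

```python
# Minimal PySpark ingest-and-transform sketch. Paths and columns are
# hypothetical placeholders for a real FIG feed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("fig-feed-ingest").getOrCreate()

# Read a raw CSV feed (assumed path and layout).
raw = (
    spark.read.option("header", True)
    .csv("s3a://example-bucket/fig/raw/institutions.csv")
)

# Basic cleansing: trim identifiers, drop rows missing the key, dedupe.
clean = (
    raw.withColumn("institution_id", F.trim(F.col("institution_id")))
    .filter(F.col("institution_id").isNotNull())
    .dropDuplicates(["institution_id"])
)

# Persist as partitioned Parquet for downstream consumers.
clean.write.mode("overwrite").partitionBy("country").parquet(
    "s3a://example-bucket/fig/curated/institutions/"
)
spark.stop()
```

The same read-clean-write shape extends naturally to jobs scheduled by Airflow or run on Databricks.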
Posted 4 days ago
6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
About the Role
Grade Level (for internal use): 10

Position Summary: Our proprietary software-as-a-service helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale. Its micro-marketing engine then delivers the right message at the right time to those customers, ensuring higher conversion rates and a stronger ROI.

What You'll Do: You will be part of our Data Platform & Product Insights data engineering team. As part of this agile team, you will work in our cloud-native environment to:
- Build and support data ingestion and processing pipelines in the cloud. This entails extraction, load, and transformation of big data from a wide variety of sources, both batch and streaming, using the latest data frameworks and technologies.
- Partner with the product team to assemble large, complex data sets that meet functional and non-functional business requirements, and ensure the build-out of data dictionaries/data catalogues and detailed documentation and knowledge around these data assets, metrics, and KPIs.
- Warehouse this data and build the data marts, data aggregations, metrics, KPIs, and business logic that lead to actionable insights into our product efficacy, marketing platform, customer behaviour, retention, etc.
- Build real-time monitoring dashboards and alerting systems.
- Coach and mentor other team members.

Who You Are:
- 6+ years of experience in big data and data engineering.
- Strong knowledge of advanced SQL, data warehousing concepts, and data mart design.
- Strong programming skills in SQL, Python/PySpark, etc.
- Experience in the design and development of data pipelines and ETL/ELT processes, on-premises and in the cloud.
- Experience with one of the cloud providers: GCP, Azure, or AWS.
- Experience with relational SQL and NoSQL databases, including Postgres and MongoDB.
- Experience with workflow management tools such as Airflow, AWS Data Pipeline, and Google Cloud Composer (a minimal DAG sketch follows this posting).
- Experience with distributed version control environments such as Git and Azure DevOps.
- Experience building Docker images, promoting and deploying them to production, and integrating with Kubernetes container orchestration by creating pods, config maps, and deployments using Terraform.
- Able to convert business queries into technical documentation.
- Strong problem-solving and communication skills.
- Bachelor's or an advanced degree in computer science or a related engineering discipline.

Good to have:
- Exposure to business intelligence (BI) tools such as Tableau, Dundas, and Power BI.
- Agile software development methodologies.
- Experience working in multi-functional, multi-location teams.

Grade: 10
Location: Gurugram
Hybrid Model: twice a week work from office
Shift Time: 12 pm to 9 pm IST

What You'll Love About Us (do ask us about these!):
- Total Rewards: monetary, beneficial, and developmental rewards!
- Work-Life Balance: you can't do a good job if your job is all you do!
- Prepare for the Future: Academy - we are all learners; we are all teachers!
- Employee Assistance Program: confidential and professional counselling and consulting.
- Diversity & Inclusion: HeForShe!
- Internal Mobility: grow with us!

About AutomotiveMastermind

Who we are: Founded in 2012, automotiveMastermind is a leading provider of predictive analytics and marketing automation solutions for the automotive industry, and believes that technology can transform data, revealing key customer insights to accurately predict automotive sales. Through its proprietary automated sales and marketing platform, Mastermind, the company empowers dealers to close more deals by predicting future buyers and consistently marketing to them. automotiveMastermind is headquartered in New York City. For more information, visit automotivemastermind.com.

At automotiveMastermind, we thrive on high energy at high speed. We're an organization in hyper-growth mode and have a fast-paced culture to match. Our highly engaged teams feel passionately about both our product and our people. This passion is what continues to motivate and challenge our teams to be best-in-class. Our cultural values of "Drive" and "Help" have been at the core of what we do and how we have built our culture through the years. This cultural framework inspires a passion for success while collaborating to win.

What We Do: Through our proprietary automated sales and marketing platform, Mastermind, we empower dealers to close more deals by predicting future buyers and consistently marketing to them. In short, we help automotive dealerships generate success in their loyalty, service, and conquest portfolios through a combination of turnkey predictive analytics, proactive marketing, and dedicated consultative services.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology – the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people; that's why we provide everything you (and your career) need to thrive at S&P Global. Our benefits include:
- Health & Wellness: health care coverage designed for the mind and body.
- Flexible Downtime: generous time off helps keep you energized for your time on.
- Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: it's not just about you; S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer, and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The "EEO is the Law" poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings (Strategic Workforce Planning)
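Since this role names Airflow, AWS Data Pipeline, and Cloud Composer, here is a minimal Airflow DAG sketch using the Airflow 2.x TaskFlow API. The task bodies, names, and schedule are placeholders, not automotiveMastermind internals.

```python
# Minimal Apache Airflow DAG sketch: a daily extract-then-load flow.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def daily_ingest():
    @task
    def extract() -> list[dict]:
        # Pull a batch from a source system (stubbed here).
        return [{"customer_id": 1, "visits": 3}]

    @task
    def load(rows: list[dict]) -> None:
        # Write to the warehouse; replace with a real load step.
        print(f"loading {len(rows)} rows")

    load(extract())


daily_ingest()
```

Cloud Composer runs the same DAG definitions, which is one reason the tooling list above groups these schedulers together.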
Posted 4 days ago
6.0 years
0 Lacs
India
On-site
Working hours: Monday through Friday, 8 hours/day, 40 hours/week, US business hours (Central US time zone).
*** YOU ARE REQUIRED TO WORK IN US BUSINESS HOURS ***
*** YOU MUST UPLOAD YOUR RESUME IN MICROSOFT WORD ***

We're looking for a Lead DBT Engineer with deep expertise in DBT, Python, and Snowflake to help architect, build, and optimize our modern data stack. This is a hands-on leadership role where you'll shape our data transformation layer using DBT, mentor engineers, and drive best practices across the data engineering team.

Key Responsibilities:
- Lead the design and implementation of scalable data pipelines using DBT and Snowflake.
- Own and maintain the DBT project structure, models, and documentation.
- Write production-grade Python code for custom transformations, orchestration, and data quality checks (a minimal orchestration sketch follows this posting).
- Collaborate with analytics, product, and engineering teams to translate business needs into well-modeled datasets.
- Implement and enforce CI/CD, testing, and deployment practices within the DBT workflow.
- Monitor data pipelines for quality, performance, and reliability.
- Serve as a technical mentor for junior and mid-level engineers.

Required Skills & Experience:
- 6+ years of experience in data engineering, with at least 2 years in a lead role.
- Advanced expertise in DBT (Data Build Tool), including Jinja, macros, snapshots, and tests.
- Proficiency in Python for data processing, scripting, and automation.
- Strong experience with Snowflake (warehousing, performance tuning, and SQL optimization).
- Solid understanding of data modeling (dimensional, star, and snowflake schemas).
- Experience working with modern data stacks (Airflow, Fivetran, Looker, etc. is a plus).
- Strong grasp of software engineering practices: version control, unit testing, and CI/CD pipelines.
- Excellent communication skills and the ability to lead cross-functional data initiatives.

Preferred Qualifications:
- Experience building or scaling a DBT implementation from scratch.
- Familiarity with orchestration tools (Airflow, Dagster, Prefect).
- Prior experience in a high-growth tech or SaaS environment.
- Exposure to cloud infrastructure (AWS, GCP, or Azure).
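As a hedged illustration of orchestrating dbt from Python, the sketch below uses dbt-core's programmatic dbtRunner entry point (available from dbt-core 1.5 onward). The `staging` selector is a placeholder model tag, not this employer's project layout.

```python
# Sketch: run dbt models, then their schema tests, from a Python script.
from dbt.cli.main import dbtRunner, dbtRunnerResult

runner = dbtRunner()

# Build the models first, then run the tests defined in the dbt project;
# fail fast if either step does not succeed.
for args in (["run", "--select", "staging"], ["test", "--select", "staging"]):
    result: dbtRunnerResult = runner.invoke(args)
    if not result.success:
        raise SystemExit(f"dbt {args[0]} failed: {result.exception}")
```

Wrapping dbt this way makes it easy to embed run/test gates inside an Airflow task or a CI/CD job, which is the workflow the responsibilities above describe.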
Posted 4 days ago
4.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Role Summary

Pfizer's purpose is to deliver breakthroughs that change patients' lives. Research and Development is at the heart of fulfilling Pfizer's purpose as we work to translate advanced science and technologies into the therapies and vaccines that matter most. Whether you are in the discovery sciences, ensuring drug safety and efficacy, or supporting clinical trials, you will apply cutting-edge design and process development capabilities to accelerate and bring best-in-class medicines to patients around the world.

Pfizer is seeking a highly skilled and motivated AI Engineer to join our advanced technology team. The successful candidate will be responsible for developing, implementing, and optimizing artificial intelligence models and algorithms to drive innovation and efficiency in our data analytics and supply chain solutions. This role demands a collaborative mindset, a passion for cutting-edge technology, and a commitment to improving patient outcomes.

Role Responsibilities:
- Lead data modeling and engineering efforts within advanced data platform teams to achieve digital outcomes; provide guidance and lead or co-lead moderately complex projects.
- Oversee the development and execution of test plans, the creation of test scripts, and thorough data validation processes.
- Lead the architecture, design, and implementation of Cloud Data Lake, Data Warehouse, Data Marts, and Data APIs.
- Lead the development of complex data products that benefit PGS and ensure reusability across the enterprise.
- Collaborate effectively with contractors to deliver technical enhancements.
- Oversee the development of automated systems for building, testing, monitoring, and deploying ETL data pipelines within a continuous integration environment.
- Collaborate with backend engineering teams to analyze data, enhancing its quality and consistency.
- Conduct root cause analysis and address production data issues.
- Lead the design, development, and implementation of AI models and algorithms that advance sophisticated data analytics and supply chain initiatives.
- Stay abreast of the latest advancements in AI and machine learning technologies and apply them to Pfizer's projects.
- Provide technical expertise and guidance to team members and stakeholders on AI-related initiatives.
- Document and present findings, methodologies, and project outcomes to various stakeholders.
- Integrate and collaborate with different technical teams across Digital to drive overall implementation and delivery.
- Work with large and complex datasets, including data cleaning, preprocessing, and feature selection.

Basic Qualifications:
- A bachelor's or master's degree in computer science, artificial intelligence, machine learning, or a related discipline.
- Over 4 years of experience as a data engineer, data architect, or in data warehousing, data modeling, and data transformations.
- Over 2 years of experience in AI, machine learning, and large language model (LLM) development and deployment.
- A proven track record of successfully implementing AI solutions in a healthcare or pharmaceutical setting is preferred.
- Strong understanding of data structures, algorithms, and software design principles.
- Programming languages: proficiency in Python and SQL, and familiarity with Java or Scala.
- AI and automation: knowledge of AI-driven tools for data pipeline automation, such as Apache Airflow or Prefect (a minimal flow sketch follows this posting).
- Ability to use GenAI or agents to augment data engineering practices.

Preferred Qualifications:
- Data warehousing: experience with data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake.
- ETL tools: knowledge of ETL tools like Apache NiFi, Talend, or Informatica.
- Big data technologies: familiarity with Hadoop, Spark, and Kafka for big data processing.
- Cloud platforms: hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
- Containerization: understanding of Docker and Kubernetes for containerization and orchestration.
- Data integration: skills in integrating data from various sources, including APIs, databases, and external files.
- Data modeling: understanding of data modeling and database design principles, including graph technologies like Neo4j or Amazon Neptune.
- Structured data: proficiency in handling structured data from relational databases, data warehouses, and spreadsheets.
- Unstructured data: experience with unstructured data sources such as text, images, and log files, and tools like Apache Solr or Elasticsearch.
- Data excellence: familiarity with data excellence concepts, including data governance, data quality management, and data stewardship.

Non-standard Work Schedule, Travel, or Environment Requirements: occasional travel required.

Work Location Assignment: Hybrid

The annual base salary for this position ranges from $96,300.00 to $160,500.00. In addition, this position is eligible for participation in Pfizer's Global Performance Plan with a bonus target of 12.5% of the base salary, and eligibility to participate in our share-based long-term incentive program. We offer comprehensive and generous benefits and programs to help our colleagues lead healthy lives and to support each of life's moments. Benefits offered include a 401(k) plan with Pfizer Matching Contributions and an additional Pfizer Retirement Savings Contribution; paid vacation, holiday, and personal days; paid caregiver/parental and medical leave; and health benefits including medical, prescription drug, dental, and vision coverage. Learn more at the Pfizer Candidate Site - U.S. Benefits (uscandidates.mypfizerbenefits.com). Pfizer compensation structures and benefit packages are aligned based on the location of hire. The United States salary range provided does not apply to Tampa, FL, or any location outside of the United States. Relocation assistance may be available based on business needs and/or eligibility.

Sunshine Act: Pfizer reports payments and other transfers of value to health care providers as required by federal and state transparency laws and implementing regulations. These laws and regulations require Pfizer to provide government agencies with information such as a health care provider's name, address, and the type of payments or other value received, generally for public disclosure. Subject to further legal review and statutory or regulatory clarification, which Pfizer intends to pursue, reimbursement of recruiting expenses for licensed physicians may constitute a reportable transfer of value under the federal transparency law commonly known as the Sunshine Act. Therefore, if you are a licensed physician who incurs recruiting expenses as a result of interviewing with Pfizer that we pay or reimburse, your name, address, and the amount of payments made currently will be reported to the government. If you have questions regarding this matter, please do not hesitate to contact your Talent Acquisition representative.

EEO & Employment Eligibility: Pfizer is committed to equal opportunity in the terms and conditions of employment for all employees and job applicants without regard to race, color, religion, sex, sexual orientation, age, gender identity or gender expression, national origin, disability, or veteran status. Pfizer also complies with all applicable national, state, and local laws governing nondiscrimination in employment, as well as work authorization and employment eligibility verification requirements of the Immigration and Nationality Act and IRCA. Pfizer is an E-Verify employer. This position requires permanent work authorization in the United States.

Information & Business Tech
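Since the qualifications above name Airflow or Prefect for pipeline automation, here is a minimal Prefect 2.x flow sketch. The steps and data are placeholders rather than Pfizer's actual data flows.

```python
# Minimal Prefect flow sketch for an ETL-style pipeline with retries.
from prefect import flow, task


@task(retries=2)
def extract() -> list[dict]:
    # Stand-in for pulling a batch from a source system.
    return [{"batch_id": "B-001", "status": "released"}]


@task
def transform(rows: list[dict]) -> list[dict]:
    # Normalize a field; real logic would add cleaning and validation.
    return [{**r, "status": r["status"].upper()} for r in rows]


@task
def load(rows: list[dict]) -> None:
    print(f"loaded {len(rows)} rows")


@flow
def etl():
    load(transform(extract()))


if __name__ == "__main__":
    etl()
```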
Posted 4 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
I am thrilled to share an exciting opportunity with one of our esteemed clients! 🚀 Join me in exploring new horizons and unlocking potential. If you're ready for a challenge and growth, read on.

Experience: 7+ years
Location: Chennai, Hyderabad (work from office)
Immediate joiners only.
Mandatory skills: SQL, Python, PySpark, Databricks (strong in core Databricks), AWS (AWS is mandatory)

JD:
- Manage and optimize cloud infrastructure (AWS, Databricks) for data storage, processing, and compute resources, ensuring seamless data operations.
- Implement data quality checks, validation rules, and transformation logic to ensure the accuracy, consistency, and reliability of data (a small validation sketch follows this posting).
- Integrate data from multiple sources, ensuring data is accurately transformed and stored in optimal formats (e.g., Delta Lake, Redshift, S3).
- Automate data workflows using tools like Airflow, Databricks APIs, and other orchestration technologies to streamline data ingestion, processing, and reporting tasks.

Regards,
R Usha
usha@livecjobs.com
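For the data-quality-check responsibility above, here is a hedged PySpark sketch of a simple validation gate before a Delta Lake write. It assumes a Databricks runtime (or delta-spark locally); the paths, table, and column names are invented.

```python
# Data-quality gate: reject the batch if validation rules fail, otherwise
# append into a Delta table for downstream consumers.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.read.parquet("s3://example-bucket/landing/orders/")

# Validation rules: primary key present and amounts non-negative.
bad = df.filter(F.col("order_id").isNull() | (F.col("amount") < 0)).count()
if bad > 0:
    raise ValueError(f"{bad} rows failed validation; aborting load")

df.write.format("delta").mode("append").saveAsTable("curated.orders")
```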
Posted 4 days ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Want to be on a team full of results-driven individuals who are constantly seeking to innovate? Want to make an impact? At SailPoint, our Engineering team does just that. Our engineering organization is where high-quality professional engineering meets individual impact. Our team creates products built on a mature, cloud-native, event-driven microservices architecture hosted in AWS.

SailPoint is seeking a Backend Software Engineer to help build a new cloud-based SaaS identity analytics product. We are looking for well-rounded backend or full stack engineers who are passionate about building and delivering reliable, scalable microservices and infrastructure for SaaS products. As one of the first members on the team, you will be integral in building this product and will be part of an agile team that is in startup mode. This is a unique opportunity to build something from scratch while having the backing of an organization that has the muscle to take it to market quickly, with a very satisfied customer base.

Responsibilities:
- Deliver efficient, maintainable data pipelines.
- Deliver robust, bug-free code for Java-based microservices.
- Build and maintain data analytics and machine learning features.
- Produce designs and rough estimates, and implement features based on product requirements.
- Collaborate with peers on designs, code reviews, and testing.
- Produce unit and end-to-end tests to improve code quality and maximize code coverage for new and existing features.
- Provide on-call production support.

Requirements:
- 4+ years of professional software development experience.
- Strong Python, SQL, and Java experience.
- Great communication skills.
- BS in computer science or a related field.
- Comprehensive experience with object-oriented analysis and design.
- Experience with workflow engines.
- Experience with continuous delivery and source control.
- Experience with observability platforms for performance metrics collection and monitoring (a minimal instrumentation sketch follows this posting).

Preferred:
- Strong experience with Airflow, Snowflake, and DBT.
- Experience with ML pipelines (SageMaker).
- Experience with continuous delivery.
- Experience working on a big data/machine learning product.

Compensation and benefits:
- Experience a small-company atmosphere with big-company benefits.
- Recharge your batteries with a flexible vacation policy and paid holidays.
- Grow with us through both technical and career growth opportunities.
- Enjoy a healthy work-life balance with flexible hours, family-friendly company events, and charitable work.

SailPoint is an equal opportunity employer and we welcome all qualified candidates to apply to join our team. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other category protected by applicable law. Alternative methods of applying for employment are available to individuals unable to submit an application through this site because of a disability. Contact hr@sailpoint.com or mail to 11120 Four Points Dr, Suite 100, Austin, TX 78726, to discuss reasonable accommodations.
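On the observability requirement above, here is a minimal sketch of instrumenting a Python service with the prometheus_client library; the metric names and the "work" being timed are illustrative, not SailPoint code.

```python
# Expose Prometheus metrics from a toy processing loop.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("pipeline_records_total", "Records processed")
LATENCY = Histogram("pipeline_step_seconds", "Per-record processing time")


def process_record() -> None:
    with LATENCY.time():          # records how long the block takes
        time.sleep(random.uniform(0.01, 0.05))  # stand-in for real work
    REQUESTS.inc()


if __name__ == "__main__":
    start_http_server(8000)  # serves /metrics for a Prometheus scraper
    while True:
        process_record()
```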
Posted 4 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
SailPoint is the leader in identity security for the cloud enterprise. Our identity security solutions secure and enable thousands of companies worldwide, giving our customers unmatched visibility into the entirety of their digital workforce and ensuring workers have the right access to do their job - no more, no less. Built on a foundation of AI and ML, our Identity Security Cloud Platform delivers the right level of access to the right identities and resources at the right time, matching the scale, velocity, and changing needs of today's cloud-oriented, modern enterprise.

About the role: Want to be on a team full of results-driven individuals who are constantly seeking to innovate? Want to make an impact? At SailPoint, our Data Platform team does just that. SailPoint is seeking a Senior Data/Software Engineer to help build a robust data ingestion and processing system to power our data platform. We are looking for well-rounded engineers who are passionate about building and delivering reliable, scalable data pipelines. This is a unique opportunity to build something from scratch while having the backing of an organization that has the muscle to take it to market quickly, with a very satisfied customer base.

Responsibilities:
- Spearhead the design and implementation of ELT processes, especially focused on extracting data from, and loading data into, various endpoints, including RDBMS, NoSQL databases, and data warehouses.
- Develop and maintain scalable data pipelines for both stream and batch processing, leveraging JVM-based languages and frameworks.
- Collaborate with cross-functional teams to understand diverse data sources and environment contexts, ensuring seamless integration into our data ecosystem.
- Utilize the AWS service stack wherever possible to implement lean design solutions for data storage, data integration, and data streaming problems.
- Develop and maintain workflow orchestration using tools like Apache Airflow.
- Stay abreast of emerging technologies in the data engineering space, proactively incorporating them into our ELT processes.
- Thrive in an environment with ambiguity, demonstrating adaptability and problem-solving skills.

Qualifications:
- BS in computer science or a related field.
- 5+ years of experience in data engineering or a related field.
- Demonstrated system-design experience orchestrating ELT processes targeting data.
- Must be willing to work 4 overlapping hours with the US time zone; you will work closely with US-based managers and engineers.
- Hands-on experience with at least one streaming or batch processing framework, such as Flink or Spark.
- Hands-on experience with containerization platforms such as Docker and container orchestration tools like Kubernetes.
- Proficiency in the AWS service stack.
- Experience with DBT, Kafka, Jenkins, and Snowflake (a minimal Kafka consumer sketch follows this posting).
- Experience leveraging tools such as Kustomize, Helm, and Terraform for implementing infrastructure as code.
- Strong interest in staying ahead of new technologies in the data engineering space.
- Comfortable working in ambiguous team situations, showcasing adaptability and drive in solving novel problems in the data engineering space.

Preferred:
- Experience with AWS.
- Experience with continuous delivery.
- Experience instrumenting code for gathering production performance metrics.
- Experience working with a data catalog tool (e.g., Atlan, Alation).

What success looks like in the role:

Within the first 30 days you will:
- Onboard into your new role, get familiar with our product offering and technology, proactively meet peers and stakeholders, and set up your test and development environment.
- Seek to deeply understand business problems or common engineering challenges and propose software architecture designs to solve them elegantly by abstracting useful common patterns.

By 90 days:
- Proactively collaborate on, discuss, debate, and refine ideas, problem statements, and software designs with different (sometimes many) stakeholders, architects, and members of your team.
- Take a committed approach to prototyping and co-implementing systems alongside less experienced engineers on your team - there's no room for ivory towers here.

By 6 months:
- Collaborate with Product Management and the Engineering Lead to estimate and deliver small to medium complexity features more independently.
- Occasionally serve as a debugging and implementation expert during escalations of system issues that have evaded the ability of less experienced engineers to solve in a timely manner.
- Share support of critical team systems by participating in calls with customers, learning the characteristics of currently running systems, and participating in improvements.

SailPoint is an equal opportunity employer and we welcome everyone to our team. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status.
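For the Kafka experience listed above, here is a minimal Python consumer sketch using the confluent-kafka client. The broker address, topic, and group id are placeholders, not SailPoint configuration.

```python
# Minimal Kafka consumer loop: poll, check for errors, process the payload.
from confluent_kafka import Consumer

consumer = Consumer(
    {
        "bootstrap.servers": "localhost:9092",
        "group.id": "identity-events",
        "auto.offset.reset": "earliest",
    }
)
consumer.subscribe(["identity-events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # Hand the payload to the processing pipeline.
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()
```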
Posted 4 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Greetings from Tata Consultancy Services!

Job openings at TCS:
Skill: Python Developer + Django + FastAPI
Experience range: 5-12 years
Role: Permanent
Job location: Hyderabad
Current location: Hyderabad
Mode of interview: Walk-in (face to face only) at Hyderabad on 21st June 2025

Please find the job description below.

Required Information / Details:
1. Role: Python Developer + Django + FastAPI
2. Required Technical Skill Set: Airflow, Kubernetes, Python, SQL, Django, FastAPI
3. No. of Requirements: 2
4. Desired Experience Range: 4+ years
5. Location of Requirement: Bangalore

Desired Competencies (Technical/Behavioral Competency)

Must-Have (ideally no more than 3-5):
- Design and develop scalable pipelines with Django and FastAPI (a minimal FastAPI sketch follows this posting).
- Hands-on experience with Kubernetes and Helm charts with Airflow.
- Experience with cloud platforms; integration of Airflow with AWS.

Thanks & Regards,
Priyanka
Talent Acquisition Group
Tata Consultancy Services
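Here is a minimal FastAPI sketch of the kind of service this role combines with Airflow. The route, model, and trigger behavior are illustrative assumptions, not a TCS specification.

```python
# Tiny FastAPI service exposing a pipeline-trigger endpoint.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class PipelineRequest(BaseModel):
    dag_id: str
    params: dict = {}


@app.post("/pipelines/trigger")
def trigger_pipeline(req: PipelineRequest) -> dict:
    # In a real service this would call the Airflow REST API or a scheduler.
    return {"dag_id": req.dag_id, "status": "queued"}
```

Run it with `uvicorn app:app --reload` and POST a JSON body like `{"dag_id": "daily_ingest"}` to try it locally.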
Posted 4 days ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Are you looking for a career move that will put you at the heart of a global financial institution? Then bring your skills in data-driven modelling and data engineering to Citi's Global FX team. By joining Citi, you will become part of a global organization whose mission is to serve as a trusted partner to our clients by responsibly providing financial services that enable growth and economic progress.

Team/Role Overview: The FX Data Analytics & AI Technology team, within Citi's FX Technology organization, seeks a highly motivated Full Stack Data Scientist / Data Engineer. The FX Data Analytics & Gen AI Technology team provides data, analytics, and tools to Citi FX sales and trading globally and is responsible for defining and executing the overall data strategy for FX. The successful candidate will be responsible for developing and implementing data-driven models, and engineering robust data and analytics pipelines, to unlock actionable insights from our vast amount of global FX data. The role will be instrumental in executing the overall data strategy for FX and will benefit from close interaction with a wide range of stakeholders across sales, trading, and technology. We are looking for a proactive individual with a practical and pragmatic attitude, the ability to build consensus, and the ability to work both collaboratively and independently in a dynamic environment.

What You'll Do:
- Design, develop, and implement quantitative models to derive insights from large and complex FX datasets, with a focus on understanding market trends and client behavior, identifying revenue opportunities, and optimizing the FX business (a toy modelling sketch follows this posting).
- Engineer data and analytics pipelines using modern, cloud-native technologies and CI/CD workflows, focusing on consolidation, automation, and scalability.
- Collaborate with stakeholders across sales and trading to understand data needs, translate them into impactful data-driven solutions, and deliver these in partnership with technology.
- Develop and integrate functionality to ensure adherence to best practices in data management, need-to-know (NTK), and data governance.
- Contribute to shaping and executing the overall data strategy for FX in collaboration with the existing team and senior stakeholders.

What We'll Need From You:
- 8 to 12 years of experience.
- Master's degree or above (or equivalent education) in a quantitative discipline.
- Proven experience in software engineering and development, and a strong understanding of computer systems and how they operate.
- Excellent Python programming skills, including experience with relevant analytical and machine learning libraries (e.g., pandas, polars, numpy, sklearn, TensorFlow/Keras, PyTorch), in addition to visualization and API libraries (matplotlib, plotly, streamlit, Flask, etc.).
- Experience developing and implementing Gen AI applications from data in a financial context.
- Proficiency working with version control systems such as Git, and familiarity with Linux computing environments.
- Experience working with different database and messaging technologies such as SQL, KDB, MongoDB, and Kafka.
- Familiarity with data visualization and, ideally, development of analytical dashboards using Python and BI tools.
- Excellent communication skills, both written and verbal, with the ability to convey complex information clearly and concisely to technical and non-technical audiences.
- Ideally, some experience working with CI/CD pipelines and containerization technologies like Docker and Kubernetes.
- Ideally, some familiarity with data workflow management tools such as Airflow, as well as big data technologies such as Apache Spark/Ignite or other caching and analytics technologies.
- A working knowledge of FX markets and financial instruments would be beneficial.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
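As a toy illustration of the pandas/scikit-learn modelling workflow named above, here is a small sketch; the synthetic "FX flow" features and target are invented purely for demonstration and carry no market meaning.

```python
# Fit a simple classifier on synthetic data and score a holdout set.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
df = pd.DataFrame(
    {
        "spread_bps": rng.normal(1.0, 0.3, 1000),      # hypothetical feature
        "notional_musd": rng.lognormal(1.0, 0.5, 1000),  # hypothetical feature
        "client_traded": rng.integers(0, 2, 1000),       # hypothetical target
    }
)

X_train, X_test, y_train, y_test = train_test_split(
    df[["spread_bps", "notional_musd"]], df["client_traded"], random_state=0
)
model = LogisticRegression().fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")
```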
Posted 4 days ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

The Applications Development Technology Lead Analyst is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities. The role requires at least two years of experience building and leading highly complex, technical data engineering teams (10+ years of hands-on data engineering experience overall), leading a data engineering team from sourcing to closing, and driving the strategic vision for the team and product.

Responsibilities:
- Partner with multiple management teams to ensure appropriate integration of functions to meet goals, and identify and define necessary system enhancements to deploy new products and process improvements.
- Manage a data-focused product and ML platform.
- Design, develop, and optimize scalable distributed data processing pipelines using Apache Spark and Scala.
- Resolve a variety of high-impact problems and projects through in-depth evaluation of complex business processes, system processes, and industry standards.
- Provide expertise in the area and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint.
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation.
- Manage, hire, and coach software engineering teams.
- Work with large-scale distributed web services and the processes around testing, monitoring, and SLAs to ensure high product quality.
- Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions.
- Serve as an advisor or coach to mid-level developers and analysts, allocating work as necessary.

Required Skills:
- Experience: 7 to 10+ years of hands-on experience in big data development, focusing on Apache Spark, Scala, and distributed systems.
- Proficiency in functional programming: high proficiency in Scala-based functional programming for developing robust and efficient data processing pipelines.
- Proficiency in big data technologies: strong experience with Apache Spark and Hadoop ecosystem tools such as Hive, HDFS, and YARN, plus Airflow, DataOps, and data management.
- Programming and scripting: advanced knowledge of Scala and a good understanding of Python for data engineering tasks.
- Data modeling and ETL processes: solid understanding of data modeling principles and ETL processes in big data environments.
- Analytical and problem-solving skills: strong ability to analyze and solve performance issues in Spark jobs and distributed systems.
- Version control and CI/CD: familiarity with Git, Jenkins, and other CI/CD tools for automating the deployment of big data applications.

Desirable Experience:
- Real-time data streaming: experience with streaming platforms such as Apache Kafka or Spark Streaming (a Structured Streaming sketch follows this posting). Python data engineering experience is a plus.
- Financial services context: familiarity with financial data processing, ensuring scalability, security, and compliance requirements.
- Leadership in data engineering: proven ability to work collaboratively with teams to develop robust data pipelines and architectures.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
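For the streaming experience above, here is a minimal Spark Structured Streaming sketch reading from Kafka. It is written in PySpark for brevity (the Scala API this role centers on is analogous); the broker, topic, and checkpoint location are placeholders, and the spark-sql-kafka package must be on the classpath.

```python
# Windowed count of Kafka events, printed to the console sink.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "trade-events")
    .load()
)

# Count events per one-minute window, keyed on the Kafka message key.
counts = (
    events.withColumn("key", F.col("key").cast("string"))
    .groupBy(F.window(F.col("timestamp"), "1 minute"), "key")
    .count()
)

query = (
    counts.writeStream.outputMode("complete")
    .format("console")
    .option("checkpointLocation", "/tmp/stream-sketch-ckpt")
    .start()
)
query.awaitTermination()
```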
Posted 4 days ago
6.0 - 11.0 years
17 - 30 Lacs
Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Inviting applications for the role of GCP Sr Data Engineer
We are seeking a highly experienced and visionary Senior Google Cloud Data Engineer to spearhead the design, development, and optimization of our data infrastructure and pipelines on the Google Cloud Platform (GCP). With over 10 years of hands-on experience in data engineering, you will be instrumental in building scalable, reliable, and performant data solutions that power our advanced analytics, machine learning initiatives, and real-time reporting. You will provide technical leadership, mentor team members, and champion best practices for data engineering within a GCP environment.
Responsibilities
Architect, design, and implement end-to-end data pipelines on GCP using services like Dataflow, Cloud Composer (Airflow), Pub/Sub, and BigQuery (a hedged sketch of such a pipeline follows this posting).
Build and optimize data warehousing solutions leveraging BigQuery's capabilities for large-scale data analysis.
Design and implement data lakes on Google Cloud Storage, ensuring efficient data organization and accessibility.
Develop and maintain scalable ETL/ELT processes to ingest, transform, and load data from diverse sources into GCP.
Implement robust data quality checks, monitoring, and alerting mechanisms within the GCP data ecosystem.
Collaborate closely with data scientists, analysts, and business stakeholders to understand their data requirements and deliver high-impact solutions on GCP.
Lead the evaluation and adoption of new GCP data engineering services and technologies.
Implement and enforce data governance policies, security best practices, and compliance requirements within the Google Cloud environment.
Provide technical guidance and mentorship to other data engineers on the team, promoting knowledge sharing and skill development within the GCP context.
Troubleshoot and resolve complex data-related issues within the GCP infrastructure.
Contribute to the development of data engineering standards, best practices, and comprehensive documentation specific to GCP.
Qualifications we seek in you!
Minimum Qualifications / Skills
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 10+ years of progressive experience in data engineering roles, with a strong focus on cloud technologies.
• Deep and demonstrable expertise with the Google Cloud Platform (GCP) and its core data engineering services (e.g., BigQuery, Dataflow, Cloud Composer, Cloud Storage, Pub/Sub, Cloud Functions).
• Extensive experience designing, building, and managing large-scale data pipelines and ETL/ELT workflows specifically on GCP.
• Strong proficiency in SQL and at least one programming language relevant to data engineering on GCP (e.g., Python).
• Comprehensive understanding of data warehousing concepts, data modeling techniques optimized for BigQuery, and NoSQL database options on GCP (e.g., Cloud Bigtable, Firestore).
• Solid grasp of data governance principles, data security best practices within GCP (IAM, KMS), and compliance frameworks.
• Excellent problem-solving, analytical, and debugging skills within a cloud environment.
• Exceptional communication, collaboration, and presentation skills, with the ability to articulate technical concepts clearly to various audiences.
Preferred Qualifications / Skills
Google Cloud certifications relevant to data engineering (e.g., Professional Data Engineer).
Experience with infrastructure-as-code tools for GCP (e.g., Terraform, Deployment Manager).
Familiarity with data streaming technologies on GCP (e.g., Dataflow, Pub/Sub).
Experience with machine learning workflows and MLOps on GCP (e.g., Vertex AI).
Knowledge of containerization technologies (Docker, Kubernetes) and their application within GCP data pipelines (e.g., Dataflow FlexRS).
Experience with data visualization tools that integrate well with GCP (e.g., Looker).
Familiarity with data cataloging and data lineage tools on GCP (e.g., Data Catalog).
Experience in [mention specific industry or domain relevant to your company].
Proven experience in leading technical teams and mentoring junior engineers in a GCP environment.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
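For illustration only (not part of the posting): a minimal sketch of the kind of Cloud Composer (Airflow) pipeline the responsibilities above describe, loading newline-delimited JSON from Cloud Storage into BigQuery. The bucket, project, dataset, and table names are hypothetical, and a real DAG would add the data quality checks and alerting the posting calls for.

```python
# Minimal sketch, assuming the Airflow Google provider package is installed.
# All resource names below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # Airflow 2.x style; newer versions use `schedule=`
    catchup=False,
) as dag:
    # Load one day's landed files into an append-only BigQuery table.
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",
        source_objects=["events/{{ ds }}/*.json"],
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="example-project.analytics.events",
        write_disposition="WRITE_APPEND",
    )
```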
Posted 5 days ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: AI Engineer
Experience: 6-12 Years
Location: Bangalore, Gurugram (Work From Office – 5 Days)
Employment Type: Full-Time
Interested Candidates Apply Here: https://forms.gle/h3pXgxc57kUB3UJX6
Job Description:
We are looking for a seasoned AI Engineer with experience in designing and deploying AI/ML solutions, with a strong focus on Generative AI (GenAI) and Large Language Models (LLMs). The ideal candidate should have deep expertise in Python, machine learning, and artificial intelligence, along with a strong understanding of real-world AI productization.
As part of our advanced tech team in Bangalore, you will play a critical role in building and integrating AI-powered applications that solve real business problems.
Key Responsibilities:
Design, develop, and deploy AI/ML solutions with a focus on Generative AI and LLMs (e.g., GPT, BERT, Claude, etc.); a hedged illustration follows this posting.
Fine-tune and customize foundation models for specific domain or business requirements.
Develop intelligent systems using Python and modern ML frameworks.
Collaborate with cross-functional teams including product, data engineering, and design to integrate AI into core products.
Evaluate, test, and implement third-party APIs, libraries, and models relevant to AI/ML/LLM tasks.
Optimize model performance, latency, and scalability for production use.
Stay up to date with the latest advancements in AI, machine learning, and GenAI technologies.
Required Skills:
9+ years of professional experience in AI/ML roles.
Strong expertise in Generative AI, LLMs, and natural language processing (NLP).
Proficient in Python and libraries like TensorFlow, PyTorch, Hugging Face Transformers.
Solid background in machine learning algorithms, deep learning, and model evaluation techniques.
Experience deploying and maintaining ML models in production.
Strong problem-solving skills and ability to handle ambiguity in requirements.
Good to Have:
Experience with prompt engineering and few-shot/fine-tuning techniques.
Knowledge of MLOps practices and tools (e.g., MLflow, Kubeflow, Airflow).
Familiarity with vector databases and retrieval-augmented generation (RAG) pipelines.
Cloud experience (AWS, GCP, or Azure).
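For illustration only (not part of the posting): a minimal sketch of serving a small open model with the Hugging Face Transformers pipeline API named above. The model choice here is just a lightweight demo, not a recommendation.

```python
# Minimal sketch: text generation with a small demo model.
# A production LLM service would add batching, safety filters, and monitoring.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # small demo model
result = generator("Generative AI helps enterprises by", max_new_tokens=40)
print(result[0]["generated_text"])
```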
Posted 5 days ago
5.0 years
0 Lacs
India
On-site
About Oportun
Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.
WORKING AT OPORTUN
Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.
Position Overview
As a Sr. Data Engineer at Oportun, you will be a key member of our team, responsible for designing, developing, and maintaining sophisticated software and data platforms in service of the engineering group's charter. Your mastery of a technical domain enables you to take up business problems and solve them with a technical solution. With your depth of expertise and leadership abilities, you will actively contribute to architectural decisions, mentor junior engineers, and collaborate closely with cross-functional teams to deliver high-quality, scalable software solutions that advance our impact in the market. This is a role where you will have the opportunity to take up responsibility in leading the technology effort – from technical requirements gathering to final successful delivery of the product – for large initiatives (cross-functional and multi-month-long projects).
Responsibilities
Data Architecture and Design: Lead the design and implementation of scalable, efficient, and robust data architectures to meet business needs and analytical requirements. Collaborate with stakeholders to understand data requirements, build subject matter expertise, and define optimal data models and structures.
Data Pipeline Development and Optimization: Design and develop data pipelines, ETL processes, and data integration solutions for ingesting, processing, and transforming large volumes of structured and unstructured data. Optimize data pipelines for performance, reliability, and scalability.
Database Management and Optimization: Oversee the management and maintenance of databases, data warehouses, and data lakes to ensure high performance, data integrity, and security. Implement and manage ETL processes for efficient data loading and retrieval.
Data Quality and Governance: Establish and enforce data quality standards, validation rules, and data governance practices to ensure data accuracy, consistency, and compliance with regulations. Drive initiatives to improve data quality and documentation of data assets.
Mentorship and Leadership: Provide technical leadership and mentorship to junior team members, assisting in their skill development and growth. Lead and participate in code reviews, ensuring best practices and high-quality code.
Collaboration and Stakeholder Management: Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand their data needs and deliver solutions that meet those needs. Communicate effectively with non-technical stakeholders to translate technical concepts into actionable insights and business value.
Performance Monitoring and Optimization: Implement monitoring systems and practices to track data pipeline performance, identify bottlenecks, and optimize for improved efficiency and scalability.
Common Requirements
You have a strong understanding of a business or system domain with sufficient knowledge and expertise around the appropriate metrics and trends.
You collaborate closely with product managers, designers, and fellow engineers to understand business needs and translate them into effective solutions.
You provide technical leadership and expertise, guiding the team in making sound architectural decisions and solving challenging technical problems. Your solutions anticipate scale, reliability, monitoring, integration, and extensibility.
You conduct code reviews and provide constructive feedback to ensure code quality, performance, and maintainability.
You mentor and coach junior engineers, fostering a culture of continuous learning, growth, and technical excellence within the team.
You play a significant role in the ongoing evolution and refinement of current tools and applications used by the team, and drive adoption of new practices within your team.
You take ownership of (customer) issues, including initial troubleshooting, identification of root cause, and issue escalation or resolution, while maintaining the overall reliability and performance of our systems.
You set the benchmark for responsiveness, ownership, and overall accountability of engineering systems.
You independently drive and lead multiple features, contribute to (a) large project(s), and lead smaller projects. You can orchestrate work that spans multiple engineers within your team and keep all relevant stakeholders informed.
You keep your lead/EM informed about your work and that of the team so they can share it with stakeholders, including escalation of issues.
Qualifications
Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
5+ years of experience in data engineering, with a focus on data architecture, ETL, and database management.
Proficiency in programming languages like Python/PySpark and Java or Scala (a hedged sketch follows this posting).
Expertise in big data technologies such as Hadoop, Spark, Kafka, etc.
In-depth knowledge of SQL and experience with various database technologies (e.g., PostgreSQL, MariaDB, NoSQL databases).
Experience and expertise in building complex end-to-end data pipelines.
Experience with orchestration and designing job schedules using CI/CD tools like Jenkins, Airflow, or Databricks.
Ability to work in an Agile environment (Scrum, Lean, Kanban, etc.).
Ability to mentor junior team members.
Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., AWS Redshift, S3, Azure SQL Data Warehouse).
Strong leadership, problem-solving, and decision-making skills.
Excellent communication and collaboration abilities.
Familiarity or certification in Databricks is a plus.
We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status or any other category protected by the laws or regulations in the locations where we operate. California applicants can find a copy of Oportun's CCPA Notice here: https://oportun.com/privacy/california-privacy-notice/. We will never request personal identifiable information (bank, credit card, etc.) before you are hired. We do not charge you for pre-employment fees such as background checks, training, or equipment. If you think you have been a victim of fraud by someone posing as us, please report your experience to the FBI's Internet Crime Complaint Center (IC3).
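For illustration only (not part of the posting): a minimal PySpark batch transform of the kind the qualifications above point to. The S3 paths and column names are hypothetical.

```python
# Minimal sketch: deduplicate raw loan records and write a curated,
# date-partitioned Parquet dataset. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("loans_etl_example").getOrCreate()

loans = spark.read.parquet("s3://example-bucket/raw/loans/")
curated = (
    loans.dropDuplicates(["loan_id"])        # keep one row per loan
         .filter(F.col("amount") > 0)        # basic data quality rule
         .withColumn("ingest_date", F.current_date())
)
curated.write.mode("overwrite").partitionBy("ingest_date").parquet(
    "s3://example-bucket/curated/loans/"
)
```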
Posted 5 days ago
8.0 years
8 - 9 Lacs
Gurgaon
On-site
You Lead the Way. We've Got Your Back.
With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities, and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally.
At American Express, you'll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.
American Express has embarked on an exciting transformation driven by an energetic new team of high performers. This is a great opportunity to join the Customer Marketing organization within American Express Technologies and become a driver of this exciting journey. We are looking for a highly skilled and experienced Senior Engineer with a history of building Big Data, GCP Cloud, Python, and Spark applications. The Senior Engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.
Joining the Enterprise Marketing team, this role will be focused on the delivery of innovative solutions to satisfy the needs of our business. As an agile team we work closely with our business partners to understand what they require, and we strive to continuously improve as a team. We pride ourselves on a culture of kindness and positivity, and a continuous focus on supporting colleague development to help you achieve your career goals. We lead with integrity, and we emphasize work/life balance for all of our teammates.
How will you make an impact in this role?
There are hundreds of opportunities to make your mark on technology and life at American Express. Here's just some of what you'll be doing:
As a part of our team, you will be developing innovative, high quality, and robust operational engineering capabilities.
Develop software in our technology stack, which is constantly evolving but currently includes Big Data, Spark, Python, Scala, GCP, and the Adobe suite (like Customer Journey Analytics).
Work with business partners and stakeholders to understand functional requirements, architecture dependencies, and business capability roadmaps.
Create technical solution designs to meet business requirements.
Define best practices to be followed by the team.
Take your place as a core member of an Agile team driving the latest development practices.
Identify and drive reengineering opportunities, and opportunities for adopting new technologies and methods.
Suggest and recommend solution architecture to resolve business problems.
Perform peer code review and participate in technical discussions with the team on the best solutions possible.
As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customers' digital lives.
Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. American Express offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex.
Minimum Qualifications:
BS or MS degree in computer science, computer engineering, or other technical discipline, or equivalent work experience.
8+ years of hands-on software development experience with Big Data & Analytics solutions – Hadoop, Hive, Spark, Scala, Python, shell scripting, GCP BigQuery, Bigtable, Airflow (a hedged BigQuery example follows this posting).
Working knowledge of the Adobe suite, like Adobe Experience Platform and Adobe Customer Journey Analytics.
Proficiency in SQL and database systems, with experience in designing and optimizing data models for performance and scalability.
Design and development experience with Kafka, real-time ETL pipelines, and APIs is desirable.
Experience in designing, developing, and optimizing data pipelines for large-scale data processing, transformation, and analysis using Big Data and GCP technologies.
Certification in a cloud platform (GCP Professional Data Engineer) is a plus.
Understanding of distributed (multi-tiered) systems, data structures, algorithms, and design patterns.
Strong object-oriented programming skills and design patterns.
Experience with CI/CD pipelines, automated test frameworks, and source code management tools (XLR, Jenkins, Git, Maven).
Good knowledge and experience with configuration management tools like GitHub.
Ability to analyze complex data engineering problems, propose effective solutions, and implement them effectively.
Looks proactively beyond the obvious for continuous improvement opportunities.
Communicates effectively with product and cross-functional teams.
Willingness to learn new technologies and leverage them to their optimal potential.
Understanding of various SDLC methodologies, familiarity with Agile and Scrum ceremonies.
We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
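For illustration only (not part of the posting): a minimal sketch of running a BigQuery query from Python, one small step of the pipeline work described above. The project, dataset, and table names are hypothetical.

```python
# Minimal sketch: run an aggregate query against a hypothetical events table.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")
query = """
    SELECT campaign_id, COUNT(*) AS sends
    FROM `example-project.marketing.email_events`
    WHERE event_date = CURRENT_DATE()
    GROUP BY campaign_id
"""
for row in client.query(query).result():
    print(row.campaign_id, row.sends)
```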
Posted 5 days ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
You Lead the Way. We've Got Your Back.
With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally.
At American Express, you'll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong.
As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customers' digital lives. Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. Amex offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.
How will you make an impact in this role?
This role will be part of the Regulatory Reporting team. We are currently modernizing our platform and migrating it to GCP. You will contribute towards making the platform more resilient and secure for future regulatory requirements, ensuring compliance and adherence to Federal Regulations.
Minimum Qualifications:
- 5-8 years of overall technology experience
- Strong expertise with handling large volumes of data coming from many different disparate systems
- Strong expertise with Python and PySpark
- Working knowledge of Apache Spark, Airflow, and the GCP BigQuery and Dataproc data processing platforms
- Working knowledge of databases and performance tuning for complex big data scenarios - Oracle DB and in-memory processing
- Cloud deployments, CI/CD, and platform resiliency
- Strong experience with SRE practices, GitHub automation, and best practices around code coverage and documentation automation
- Good experience with MVEL
- Excellent communication skills, a collaboration mindset, and the ability to work through unknowns
Preferred Qualifications:
- Understanding of Regulatory and Compliance Reports preferred
- Experience with React and Node.js
- Experience with GCP - BigQuery and Dataflow, data migration to BigQuery, and usage of Cloud SQL
We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 5 days ago
4.0 - 8.0 years
5 - 9 Lacs
Pune
On-site
About VOIS
VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.
VOIS India
In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations and HR Operations and more.
Must-have technical / professional qualifications:
Primary Skills: Teradata SQL & ETL, Linux/Unix - Shell Scripting
Alternate Skills: GCP (BigQuery, Dataform, Dataproc), Python, Git or similar versioning tools, Airflow or similar scheduling tools
Good experience in:
Minimum experience of 4-8 years in data engineering/data warehousing/ETL engineering
Good knowledge of SQL
Good knowledge of Python scripting
Good understanding of cloud-native platforms (GCP)
Telecommunication experience
Core competencies, knowledge and experience:
Essential:
Strong data warehouse development experience in cloud-native technologies (GCP preferred)
Strong SQL experience - advanced level of SQL scripting
Expert in Python (at least 12 months in real-time projects)
Excellent data interpretation skills
Good knowledge of data warehousing and business intelligence, and a good understanding of a wide range of data manipulation and analysis techniques
Excellent verbal, written and interpersonal communication skills, demonstrating the ability to communicate information technology concepts to non-technology personnel; should be able to interact with the customer team and share ideas
Strong analytical, problem-solving and decision-making skills, and the attitude to plan and organize work to deliver as agreed
Hands-on experience in working with large datasets
Able to manage different stakeholders
Experience:
Exceptional data manipulation and analysis techniques; comfortable using very large (tens of millions of rows) datasets, containing both structured and unstructured data
Designing and implementing changes to the existing data model
Developing and maintaining relational staging areas of the application layer
Supporting the operations team on data quality and data consistency issues and essential business-critical processes
Driving system optimization and simplification
VOIS Equal Opportunity Employer Commitment
VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society.
We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we'll be in touch!
Posted 5 days ago
7.0 years
2 - 5 Lacs
Ahmedabad
On-site
Unlock Your Potential With IGNEK
Welcome to IGNEK, where we combine innovation and passion! We want our workplace to help you grow professionally and appreciate the special things each person brings. Come with us as we use advanced technology to make a positive difference. At IGNEK, we know our success comes from our team's talent and hard work.
Culture & Values
Our culture and values guide our actions and define our principles.
Growth: Learn and grow with us. We're committed to providing opportunities for you to excel and expand your horizons.
Transparency: We are very transparent in terms of work, culture and communication to build trust and strong bonding among employees, teams and managers.
People First: Our success is all about our people. We care about your well-being and value diversity in our inclusive workplace.
Be a Team: Teamwork is our strength. Embrace a "Be a Team" mindset, valuing collective success over individual triumphs. Together, we can overcome challenges and reach new heights.
Perks & Benefits
Competitive flexibility and comprehensive benefits prioritize your well-being. Creative programs, professional development, and a vibrant work-life balance ensure your success is our success: 5-day work weeks, festival celebrations, rewards and benefits, certification programs, skills improvement, a referral program, a friendly work culture, training and development, enterprise projects, leave carry-forward, a yearly trip, hybrid work, indoor and outdoor fun activities, flexible timing, team lunches, and work-life balance.
What Makes You Different?
BE Authentic: Stay true to yourself, it's what sets you apart.
BE Proactive: Take charge of your work, don't wait for things to happen.
BE A Learner: Keep an open mind and never stop seeking knowledge.
BE Professional: Approach every task with diligence and integrity.
BE Innovative: Think outside the box and push boundaries.
BE Passionate: Let your enthusiasm light the path to success.
Python Backend Developer
Technology: Python
Job Type: Full Time
Job Location: Ahmedabad
Experience: 7+ Years
Location: Ahmedabad (On-site)
Overview: We are seeking a highly skilled Python Backend Developer with a strong foundation in building scalable, cloud-native microservices and business logic-driven systems. You will play a key role in delivering backend solutions on AWS, leveraging modern development tools and practices to build robust, enterprise-grade services.
Key Responsibilities:
Design, develop, and maintain scalable RESTful APIs and microservices using Python (a hedged sketch follows this posting).
Lead the end-to-end implementation of backend systems on AWS Cloud.
Modernize and migrate legacy systems to cloud-native architectures.
Integrate with relational databases (Postgres RDS, Oracle) and graph databases.
Collaborate with tech leads and stakeholders to translate requirements into scalable backend solutions.
Conduct unit, integration, and functional testing to ensure high reliability and performance.
Follow SDLC best practices, and deploy code using CI/CD automation pipelines.
Use orchestration tools like Apache Airflow to streamline backend workflows.
Ensure solutions meet security, performance, and scalability standards.
Stay current with emerging technologies and best practices in backend and cloud development.
Required Skills & Experience:
7+ years of backend development experience with a strong focus on Python.
Proficient in Python for service development and data integration tasks. Additional experience with Java, PL/SQL, and Bash scripting is a plus. Strong hands-on experience with AWS services: EC2, Lambda, RDS, S3, IAM, API Gateway, Kinesis. Expertise in PostgreSQL or Oracle, including stored procedures and performance tuning. Solid understanding of microservices architecture and serverless computing. Familiarity with Elasticsearch or other search platforms. Practical experience with CI/CD tools: GitLab, GitHub, AWS CodePipeline. Experience with Terraform, Docker, and cloud infrastructure setup and management. Preferred Qualifications: AWS Certification (e.g., AWS Certified Solutions Architect). Experience working in Agile/Scrum environments. Exposure to ETL pipeline development and data-driven backend systems. Understanding of Kubernetes, networking, and cloud security principles.
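For illustration only (not part of the posting): a minimal FastAPI endpoint in the spirit of the REST/microservice work described above. The route and model are hypothetical.

```python
# Minimal sketch: a single validated POST endpoint.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Order(BaseModel):
    order_id: int
    amount: float

@app.post("/orders")
def create_order(order: Order) -> dict:
    # A real service would persist to Postgres RDS and emit events here.
    return {"status": "accepted", "order_id": order.order_id}
```

Assuming the file is saved as main.py, it can be run locally with `uvicorn main:app --reload`.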
Posted 5 days ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
As a Data Engineer, you are required to:
Design, build, and maintain data pipelines that efficiently process and transport data from various sources to storage systems or processing environments while ensuring data integrity, consistency, and accuracy across the entire data pipeline.
Integrate data from different systems, often involving data cleaning, transformation (ETL), and validation.
Design the structure of databases and data storage systems, including the design of schemas, tables, and relationships between datasets to enable efficient querying.
Work closely with data scientists, analysts, and other stakeholders to understand their data needs and ensure that the data is structured in a way that makes it accessible and usable.
Stay up-to-date with the latest trends and technologies in the data engineering space, such as new data storage solutions, processing frameworks, and cloud technologies. Evaluate and implement new tools to improve data engineering processes.
Qualification: Bachelor's or Master's in Computer Science & Engineering, or equivalent. A professional degree in Data Science or Engineering is desirable.
Experience level: At least 3-5 years of hands-on experience in Data Engineering and ETL.
Desired Knowledge & Experience:
Spark: Spark 3.x, RDD/DataFrames/SQL, Batch/Structured Streaming
Knowing Spark internals: Catalyst/Tungsten/Photon
Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity, Autoloader
IDE: IntelliJ/PyCharm, Git, Azure DevOps, GitHub Copilot
Test: pytest, Great Expectations (a hedged pytest example follows this posting)
CI/CD: YAML Azure Pipelines, Continuous Delivery, Acceptance Testing
Big Data Design: Lakehouse/Medallion Architecture, Parquet/Delta, Partitioning, Distribution, Data Skew, Compaction
Languages: Python/Functional Programming (FP)
SQL: TSQL/Spark SQL/HiveQL
Storage: Data Lake and Big Data Storage Design
Additionally, it is helpful to know the basics of:
Data Pipelines: ADF/Synapse Pipelines/Oozie/Airflow
Languages: Scala, Java
NoSQL: Cosmos, Mongo, Cassandra
Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model
SQL Server: TSQL, Stored Procedures
Hadoop: HDInsight/MapReduce/HDFS/YARN/Oozie/Hive/HBase/Ambari/Ranger/Atlas/Kafka
Data Catalog: Azure Purview, Apache Atlas, Informatica
Required Soft Skills & Other Capabilities:
Great attention to detail and good analytical abilities.
Good planning and organizational skills.
Collaborative approach to sharing ideas and finding solutions.
Ability to work independently and also in a global team environment.
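For illustration only (not part of the posting): a minimal pytest example in the spirit of the testing stack listed above, checking a small pure-Python transform. The function and test names are hypothetical.

```python
# Minimal sketch: a transform and its pytest test in one file.
def dedupe_by_key(rows: list[dict], key: str) -> list[dict]:
    """Keep the first row seen for each distinct key value."""
    seen, out = set(), []
    for row in rows:
        if row[key] not in seen:
            seen.add(row[key])
            out.append(row)
    return out

def test_dedupe_keeps_first_occurrence():
    rows = [{"id": 1, "v": "a"}, {"id": 1, "v": "b"}, {"id": 2, "v": "c"}]
    assert dedupe_by_key(rows, "id") == [{"id": 1, "v": "a"}, {"id": 2, "v": "c"}]
```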
Posted 5 days ago
5.0 years
0 Lacs
Jaipur
On-site
ABOUT HAKKODA
Hakkoda, an IBM Company, is a modern data consultancy that empowers data-driven organizations to realize the full value of the Snowflake Data Cloud. We provide consulting and managed services in data architecture, data engineering, analytics and data science. We are renowned for bringing our clients deep expertise, being easy to work with, and being an amazing place to work! We are looking for curious and creative individuals who want to be part of a fast-paced, dynamic environment, where everyone's input and efforts are valued. We hire outstanding individuals and give them the opportunity to thrive in a collaborative atmosphere that values learning, growth, and hard work. Our team is distributed across North America, Latin America, India and Europe. If you have the desire to be a part of an exciting, challenging, and rapidly-growing Snowflake consulting services company, and if you are passionate about making a difference in this world, we would love to talk to you!
We are seeking a skilled and collaborative Sr. Data/Python Engineer with experience in the development of production Python-based applications (such as Django, Flask, or FastAPI on AWS) to support our data platform initiatives and application development. This role will initially focus on building and optimizing Streamlit application development frameworks and CI/CD pipelines, ensuring code reliability through automated testing with Pytest, and enabling team members to deliver updates via CI/CD pipelines. Once the deployment framework is implemented, the Sr. Engineer will own and drive data transformation pipelines in dbt and implement a data quality framework.
Key Responsibilities:
Lead application testing and productionalization of applications built on top of Snowflake. This includes implementation and execution of unit testing and integration testing; automated test suites include use of Pytest and Streamlit App Tests to ensure code quality, data accuracy, and system reliability (a hedged sketch follows this posting).
Development and integration of CI/CD pipelines (e.g., GitHub Actions, Azure DevOps, or GitLab CI) for consistent deployments across dev, staging, and production environments.
Development and testing of AWS-based pipelines - AWS Glue, Airflow (MWAA), S3.
Design, develop, and optimize data models and transformation pipelines in Snowflake using SQL and Python.
Build Streamlit-based applications to enable internal stakeholders to explore and interact with data and models.
Collaborate with team members and application developers to align requirements and ensure secure, scalable solutions.
Monitor data pipelines and application performance, optimizing for speed, cost, and user experience.
Create end-user technical documentation and contribute to knowledge sharing across engineering and analytics teams.
Work in CST hours and collaborate with onshore and offshore teams.
Qualifications, Skills & Experience
5+ years of experience in Data Engineering or Python-based application development on AWS (Flask, Django, FastAPI, Streamlit). Experience building data-intensive applications in Python as well as data pipelines on AWS is a must.
Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field (or equivalent experience).
Proficient in SQL and Python for data manipulation and automation tasks.
Experience with developing and productionalizing applications built on Python-based frameworks such as FastAPI, Django, Flask.
Experience with application frameworks such as Streamlit, Angular, React, etc., for rapid data app deployment.
Solid understanding of software testing principles and experience using Pytest or similar Python frameworks.
Experience configuring and maintaining CI/CD pipelines for automated testing and deployment.
Familiarity with version control systems such as GitLab.
Knowledge of data governance, security best practices, and role-based access control (RBAC) in Snowflake.
Preferred Qualifications:
Experience with dbt (data build tool) for transformation modeling.
Knowledge of Snowflake's advanced features (e.g., masking policies, external functions, Snowpark).
Exposure to cloud platforms (e.g., AWS, Azure, GCP).
Strong communication and documentation skills.
Benefits:
Health insurance
Paid leave
Technical training and certifications
Robust learning and development opportunities
Incentives
Toastmasters
Food program
Fitness program
Referral bonus program
Hakkoda is committed to fostering diversity, equity, and inclusion within our teams. A diverse workforce enhances our ability to serve clients and enriches our culture. We encourage candidates of all races, genders, sexual orientations, abilities, and experiences to apply, creating a workplace where everyone can succeed and thrive.
Ready to take your career to the next level? Apply today and join a team that's shaping the future!
Hakkoda is an IBM subsidiary which has been acquired by IBM and will be integrated into the IBM organization. Hakkoda will be the hiring entity. By proceeding with this application, you understand that Hakkoda will share your personal information with other IBM subsidiaries involved in your recruitment process, wherever these are located. More information on how IBM protects your personal information, including the safeguards in case of cross-border data transfer, is available here.
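For illustration only (not part of the posting): a tiny Streamlit app of the kind this role builds. The inline DataFrame is a demo stand-in for a Snowflake query result, and all names are hypothetical.

```python
# Minimal sketch: a small interactive data app (run with `streamlit run app.py`).
import pandas as pd
import streamlit as st

st.title("Orders explorer")
df = pd.DataFrame({"region": ["NA", "EU", "APAC"], "orders": [120, 95, 64]})
region = st.selectbox("Region", df["region"])
st.metric("Orders", int(df.loc[df["region"] == region, "orders"].iloc[0]))
st.dataframe(df)
```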
Posted 5 days ago
5.0 - 7.0 years
0 - 0 Lacs
Visakhapatnam
On-site
Job Description: Maintenance Incharge (Catering Industry - Multi-Kitchen Operations)
Position Title: Maintenance Incharge / Head of Maintenance Engineering
Department: Engineering & Maintenance
Reports To: Operations Manager / Asst General Manager
Location: [All Location(s) - , Multi-Outlet Facility]
Employment Type: Full-Time
Mission of the Role
To ensure the seamless, safe, and efficient operation of all kitchen equipment, utilities, and facility infrastructure across catering operations, minimizing downtime, ensuring compliance, and maximizing equipment lifespan through expert technical oversight, proactive maintenance planning, and hands-on leadership.
Core Responsibilities
Strategic Maintenance Leadership:
Develop, implement, and oversee a comprehensive Preventive Maintenance (PM) program for all critical kitchen equipment (boilers, motors, grinders, exhausts, refrigeration) and facility systems across all designated kitchens.
Create and manage the annual maintenance budget, prioritizing critical repairs and upgrades.
Lead, mentor, and schedule the maintenance team (technicians, helpers), ensuring adequate coverage for all shifts and locations.
Maintain detailed records (CMMS - Computerized Maintenance Management System preferred) of all maintenance activities, work orders, spare parts inventory, and equipment history.
Technical Expertise & Troubleshooting (Critical Systems):
Boilers: Possess in-depth knowledge of operation, maintenance (daily checks, water treatment, blowdowns), troubleshooting, safety protocols (including statutory compliance), and minor repairs of industrial catering boilers (steam/hot water). Understand pressure systems regulations.
Motors & Drives: Expert in troubleshooting, repairing, and maintaining electric motors (specifically 2HP and above, commonly found in mixers, grinders, exhaust fans, pumps), including understanding starters (DOL, Star-Delta), VFDs, bearings, alignment, and load testing.
Exhaust Systems (Sukhad): Thorough understanding of commercial kitchen exhaust hoods, ductwork, fire suppression systems (Ansul), and extraction fans. Ensure optimal airflow, grease management, and compliance with fire safety regulations. Schedule and oversee deep cleaning.
Refrigeration & Cold Rooms: Maintain optimal performance of walk-in cold rooms, freezers, chillers, refrigerators, and ice machines. Troubleshoot refrigerant issues (within permissible scope), compressors, condensers, evaporators, controls, and temperature monitoring systems. Understand HACCP implications of temperature failures.
Grinders & Processing Equipment: Expertise in maintaining, troubleshooting, and repairing commercial meat grinders, vegetable cutters, mixers, blenders, and food processors. Focus on safety interlocks, blade sharpening/replacement, gearboxes, and drive mechanisms.
Other Key Equipment: Oversee maintenance of ovens (convection, deck, combi), fryers, cooking ranges, dishwashers (conveyor, flight type), pasta cookers, bain-maries, hot cupboards, and associated gas/electric/steam lines.
Operational Excellence & Compliance:
Preventive Maintenance: Execute and supervise scheduled PM tasks rigorously to prevent breakdowns.
Breakdown Management: Respond urgently to equipment failures in kitchens, diagnose faults accurately, perform repairs efficiently, or coordinate with external vendors when necessary to minimize disruption to food production.
Spare Parts Management: Maintain optimal inventory levels of critical spare parts for key equipment. Source parts cost-effectively.
Safety & Compliance: Ensure all work adheres to strict safety standards (LOTO, electrical safety, working at height, confined space if applicable), food safety regulations (preventing contamination during repairs), and local statutory requirements (boiler inspections, electrical certifications, fire safety).
Vendor Management: Liaise with and oversee external contractors for specialized repairs, statutory inspections, and major overhauls. Ensure quality and cost control.
Energy Efficiency: Identify and implement opportunities to improve energy efficiency of equipment (e.g., optimizing boiler operation, motor efficiency, refrigeration settings).
Training & Communication:
Train kitchen staff on the correct and safe basic operation and minor care (e.g., cleaning, reporting issues) of equipment.
Train maintenance technicians on specific equipment and procedures.
Communicate effectively with Kitchen Managers, Chefs, and Operations Management regarding maintenance schedules, downtime, and critical issues.
Prepare regular reports on maintenance performance, downtime analysis, and cost tracking.
Mandatory Qualifications & Experience
Education: ITI (Electrical/Mechanical/Fitter) Diploma or equivalent. A Diploma/Degree in Mechanical/Electrical Engineering is highly preferred.
Experience: Minimum 5-7 years of hands-on experience in maintenance, with at least 3 years specifically in the hospitality/catering industry or a heavy industrial setting with similar equipment (FMCG, pharma plant kitchens). Proven experience leading a maintenance team is essential.
Technical Skills (Non-Negotiable):
Deep Practical Knowledge: Proven expertise in troubleshooting, repairing, and maintaining:
Industrial Boilers (Operation, Maintenance, Safety)
Electric Motors (2HP and above - Dismantling, Rewinding/Bearing Replacement, Alignment, Starter Circuits)
Commercial Kitchen Exhaust Systems (Sukhad - Hoods, Ducts, Fans, Fire Systems)
Refrigeration Systems & Walk-in Cold Rooms/Freezers (Compressors, Controls, Defrost, Glycol Systems)
Heavy-Duty Grinders, Mixers, Cutters, and Food Processing Machinery
Strong Fundamentals: Excellent understanding of mechanical systems (gearboxes, bearings, belts, chains, pneumatics), electrical systems (single & three-phase power, controls, basic PLC understanding), and plumbing.
Safety Focus: Thorough knowledge of relevant safety protocols (Electrical, LOTO, Pressure Vessels, Working at Height).
Tools: Proficiency with hand tools, power tools, electrical testing equipment (multimeter, clamp meter, megger), and welding/gas cutting (advantageous).
Certifications (Highly Desirable):
Boiler Operation Engineer (BOE) certificate or equivalent (mandatory in some jurisdictions).
Refrigeration handling certificate (type depending on local regulations).
Certified Maintenance & Reliability Professional (CMRP) or similar.
Electrical License (if applicable locally).
Soft Skills:
Strong leadership and team management abilities.
Excellent problem-solving and analytical skills under pressure.
Outstanding communication (verbal & written) and interpersonal skills.
Proactive, organized, and meticulous with documentation.
Ability to prioritize effectively in a fast-paced, 24/7 environment.
Basic computer literacy (MS Office, CMMS software).
Working Conditions
Primarily based in industrial kitchen/production environments (hot, humid, noisy).
Requires frequent standing, walking, bending, lifting (up to 25 kg), and working in confined spaces.
On-call availability for emergencies outside normal hours (nights, weekends, holidays) is essential.
May require travel between multiple kitchen locations if applicable.
Job Types: Full-time, Permanent
Pay: ₹20,000.00 - ₹30,000.00 per month
Benefits: Health insurance, leave encashment, Provident Fund
Schedule: Day shift, morning shift
Work Location: In person
Posted 5 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Senior Software Engineer
Department: IDP
About Us
HG Insights is the global leader in technology intelligence, delivering actionable AI-driven insights through advanced data science and scalable big data solutions. Our Big Data Insights Platform processes billions of unstructured documents and powers a vast data lake, enabling enterprises to make strategic, data-driven decisions. Join our team to solve complex data challenges at scale and shape the future of B2B intelligence.
What You'll Do:
Design, build, and optimize large-scale distributed data pipelines for processing billions of unstructured documents using Databricks, Apache Spark, and cloud-native big data tools.
Architect and scale enterprise-grade big data systems, including data lakes, ETL/ELT workflows, and syndication platforms for customer-facing Insights-as-a-Service (InaaS) products.
Collaborate with product teams to develop features across databases, backend services, and frontend UIs that expose actionable intelligence from complex datasets.
Implement cutting-edge solutions for data ingestion, transformation, and analytics using Hadoop/Spark ecosystems, Elasticsearch, and cloud services (AWS EC2, S3, EMR).
Drive system reliability through automation, CI/CD pipelines (Docker, Kubernetes, Terraform), and infrastructure-as-code practices.
What You'll Be Responsible For
Leading the development of our Big Data Insights Platform, ensuring scalability, performance, and cost-efficiency across distributed systems.
Mentoring engineers, conducting code reviews, and establishing best practices for Spark optimization, data modeling, and cluster resource management.
Building and troubleshooting complex data pipelines, including performance tuning of Spark jobs, query optimization, and data quality enforcement.
Collaborating in agile workflows (daily stand-ups, sprint planning) to deliver features rapidly while maintaining system stability.
Ensuring security and compliance across data workflows, including access controls, encryption, and governance policies.
What You'll Need
BS/MS/Ph.D. in Computer Science or related field, with 5+ years of experience building production-grade big data systems.
Expertise in Scala/Java for Spark development, including optimization of batch/streaming jobs and debugging distributed workflows.
Proven track record with: Databricks, Hadoop/Spark ecosystems, and SQL/NoSQL databases (MySQL, Elasticsearch); cloud platforms (AWS EC2, S3, EMR) and infrastructure-as-code tools (Terraform, Kubernetes); RESTful APIs, microservices architectures, and CI/CD automation.
Leadership experience as a technical lead, including mentoring engineers and driving architectural decisions.
Strong understanding of agile practices, distributed computing principles, and data lake architectures.
Airflow orchestration (DAGs, operators, sensors) and integration with Spark/Databricks.
7+ years of designing, modeling and building big data pipelines in an enterprise work setting.
Nice-to-Haves
Experience with machine learning pipelines (Spark MLlib, Databricks ML) for predictive analytics.
Knowledge of data governance frameworks and compliance standards (GDPR, CCPA).
Contributions to open-source big data projects or published technical blogs/papers.
DevOps proficiency in monitoring tools (Prometheus, Grafana) and serverless architectures.
Posted 5 days ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role and Responsibilities
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In This Role, Your Responsibilities May Include:
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements.
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Preferred Education: Master's Degree
Required Technical and Professional Expertise
Experience in big data technologies like Hadoop, Apache Spark, Hive.
Practical experience in Core Java (1.8 preferred)/Python/Scala.
Experience in AWS cloud services, including S3, Redshift, EMR, etc.
Strong expertise in RDBMS and SQL.
Good experience in Linux and shell scripting.
Experience in data pipelines using Apache Airflow.
Preferred Technical and Professional Experience
You thrive on teamwork and have excellent verbal and written communication skills.
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
Ability to communicate results to technical and non-technical audiences.
Posted 5 days ago
6.0 years
0 Lacs
Pune, Maharashtra, India
Remote
HackerOne is a global leader in offensive security solutions. Our HackerOne Platform combines AI with the ingenuity of the largest community of security researchers to find and fix security, privacy, and AI vulnerabilities across the software development lifecycle. The platform offers bug bounty, vulnerability disclosure, pentesting, AI red teaming, and code security. We are trusted by industry leaders like Amazon, Anthropic, Crypto.com, General Motors, GitHub, Goldman Sachs, Uber, and the U.S. Department of Defense. HackerOne was named a Best Workplace for Innovators by Fast Company in 2023 and a Most Loved Workplace for Young Professionals in 2024. HackerOne Values HackerOne is dedicated to fostering a strong and inclusive culture. HackerOne is Customer Obsessed and prioritizes customer outcomes in our decisions and actions. We Default to Disclosure by operating with transparency and integrity, ensuring trust and accountability. Employees, researchers, customers, and partners Win Together by fostering empowerment, inclusion, respect, and accountability. Senior Analytics Engineer, DataOne Location: Pune, India This role requires the candidate to be based in Pune and work from an office 4 or 5 days a week. Please only apply if you're okay with these requirements. *** Position Summary HackerOne is seeking a Senior Analytics Engineer to join our DataOne team. You will lead the discovery, architecture, and development of high-impact, high-performance, scalable source of truth data marts and data products. Joining our growing, distributed organization, you'll be instrumental in building the foundation that powers HackerOne's one source of truth. As a Senior Analytics Engineer, you'll be able to lead challenging projects and foster collaboration across the company. Leveraging your extensive technological expertise, domain knowledge, and dedication to business objectives, you'll drive innovation to propel HackerOne forward. DataOne democratizes source-of-truth information and insights to enable all Hackeronies to ask the right questions, tell cohesive stories, and make rigorous decisions so that HackerOne can delight our Customers and empower the world to build a safer internet . The future is one where every Hackeronie is a catalyst for positive change , driving data-informed innovation while fostering our culture of transparency, collaboration, integrity, excellence, and respect for all . What You Will Do Your first 30 days will focus on getting to know HackerOne. You will join your new squad and begin onboarding - learn our technology stack (Python, Airflow, Snowflake, DBT, Meltano, Fivetran, Looker, AWS), and meet our Hackeronies. Within 60 days, you will deliver impact on a company level with consistent contribution to high-impact, high-performance, scalable source of truth data marts and data products. Within 90 days, you will drive the continuous evolution and innovation of data at HackerOne, identifying and leading new initiatives. Additionally, you foster cross-departmental collaboration to enhance these efforts. Deliver impact by developing the roadmap for continuously and iteratively launching high-impact, high-performance, scalable source of truth data marts and data products, and by leading and delivering cross-functional product and technical initiatives. 
- Be a technical paragon and cross-functional force multiplier: autonomously determine where to apply focus, contribute at all levels, elevate your squad, and design solutions to ambiguous business challenges in a fast-paced, early-stage environment.
- Drive continuous evolution and innovation, the adoption of emerging technologies, and the implementation of industry best practices.
- Champion a higher bar for discoverability, usability, reliability, timeliness, consistency, validity, uniqueness, simplicity, completeness, integrity, security, and compliance of information and insights across the company.
- Provide technical leadership and mentorship, fostering a culture of continuous learning and growth.
Minimum Qualifications
- 6+ years of experience as an Analytics Engineer, Business Intelligence Engineer, Data Engineer, or similar role, with a proven track record of launching source-of-truth data marts.
- 6+ years of experience building and optimizing data pipelines, products, and solutions.
- Must be flexible to accommodate occasional evening meetings in US time zones.
- Extensive experience with data technologies and tools such as Airflow, Snowflake, Meltano, Fivetran, DBT, and AWS.
- Expert in SQL for data manipulation in a fast-paced work environment.
- Expert in creating compelling data stories using data visualization tools such as Looker, Tableau, Sigma, Domo, or PowerBI.
- Proven track record of substantial impact across the company, as well as externally for the company, demonstrating your ability to drive positive change and achieve significant results.
- English fluency and excellent communication skills; able to present data-driven narratives in verbal, presentation, and written formats.
- Passion for working backwards from the Customer and empathy for business stakeholders.
- Experience shaping the strategic vision for data.
- Experience working with Agile and iterative development processes.
Preferred Qualifications
- Strong proficiency in at least one data programming language such as Python or R.
- Experience working within, and with data from, business applications such as Salesforce, Clari, Gainsight, Workday, GitLab, Slack, or Freshservice.
- Proven track record of driving innovation, adopting emerging technologies, and implementing industry best practices.
- Thrive on solving ambiguous problem statements in an early-stage environment.
- Experience designing advanced data visualizations and data-rich interfaces in Figma or equivalent.
Compensation Bands: Pune, India: ₹3.7M – ₹4.6M; offers equity.
Job Benefits:
- Health (medical, vision, dental), life, and disability insurance*
- Equity stock options
- Retirement plans
- Paid public holidays and unlimited PTO
- Paid maternity and parental leave
- Leaves of absence (including caregiver leave and leave under CO's Healthy Families and Workplaces Act)
- Employee Assistance Program
- Flexible Work Stipend
*Eligibility may differ by country
We're committed to building a global team! For certain roles outside the United States, U.K., and the Netherlands, we partner with Remote.com as our Employer of Record (EOR). Visa/work permit sponsorship is not available. Employment at HackerOne is contingent on a background check.
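To make the data-quality dimensions above concrete, here is a minimal Python sketch of the kind of completeness, uniqueness, and validity checks an analytics engineer might run against a data mart; the table contents and column names are invented for illustration.

```python
# Illustrative only: simple completeness/uniqueness/validity checks
# of the kind an analytics engineer might enforce on a data mart.
# The DataFrame contents and column names are hypothetical.
import pandas as pd

mart = pd.DataFrame({
    "account_id": [1, 2, 2, 4],
    "arr_usd": [1200.0, None, 800.0, 950.0],
    "region": ["AMER", "EMEA", "EMEA", "APAC"],
})

def check_quality(df: pd.DataFrame) -> dict:
    """Return a small report covering common data-quality dimensions."""
    return {
        # Completeness: share of non-null values per column.
        "completeness": df.notna().mean().to_dict(),
        # Uniqueness: is the declared primary key actually unique?
        "unique_primary_key": df["account_id"].is_unique,
        # Validity: do all regions come from the allowed set?
        "valid_regions": df["region"].isin({"AMER", "EMEA", "APAC"}).all(),
    }

print(check_quality(mart))
```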
HackerOne is an Equal Opportunity Employer in the terms and conditions of employment for all employees and job applicants without regard to race, color, religion, sex, sexual orientation, age, gender identity or gender expression, national origin, pregnancy, disability or veteran status, or any other protected characteristic as outlined by international, federal, state, or local laws. This policy applies to all HackerOne employment practices, including hiring, recruiting, promotion, termination, layoff, recall, leave of absence, compensation, benefits, training, and apprenticeship. HackerOne makes hiring decisions based solely on qualifications, merit, and business needs at the time. For US-based roles only: pursuant to the San Francisco Fair Chance Ordinance, all qualified applicants with arrest and conviction records will be considered for the position.
Posted 5 days ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description
Responsibilities:
- Design, develop, implement, test, and maintain automated test suites and frameworks for AI/ML pipelines.
- Collaborate closely with ML engineers and data scientists to understand model architectures and data workflows.
- Develop and execute test plans, test cases, and test scripts to identify software defects in AI/ML applications.
- Ensure end-to-end quality of AI/ML solutions, including data integrity, model performance, and system integration.
- Implement continuous integration and continuous deployment (CI/CD) processes for ML pipelines.
- Conduct performance and scalability testing for AI/ML systems.
- Document and track software defects using bug-tracking systems, and report issues to development teams.
- Participate in code reviews and provide feedback on testability and quality.
- Help foster a culture of quality and continuous improvement within the ML engineering group.
- Stay current with the latest trends and best practices in AI/ML testing and quality assurance.
Must Haves:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2+ years of experience in quality assurance, specifically testing AI/ML applications.
- Strong programming skills in Python (experience with libraries like PyTest or unittest).
- Familiarity with machine learning frameworks (TensorFlow, PyTorch, or scikit-learn).
- Experience with test automation tools and frameworks.
- Knowledge of CI/CD tools (Jenkins, GitLab CI, or similar).
- Experience with containerization technologies like Docker and orchestration systems like Kubernetes.
- Proficiency with Linux operating systems.
- Familiarity with version control systems like Git.
- Strong understanding of software testing methodologies and best practices.
- Excellent analytical and problem-solving skills.
- Excellent communication and collaboration skills.
Bonus Attributes:
- Experience testing data pipelines and ETL processes.
- Cloud platform experience with GCP, AWS, or Azure.
- Knowledge of big data technologies like Apache Spark, Kafka, or Airflow.
- Experience with performance testing tools.
- Understanding of data science concepts and statistical analysis.
- Certifications in software testing or cloud technologies.
Abilities:
- Work with a high level of initiative, accuracy, and attention to detail.
- Prioritize multiple assignments effectively and meet established deadlines.
- Interact successfully, efficiently, and professionally with staff and customers.
- Excellent organizational skills.
- Critical thinking on moderately to highly complex problems.
- Flexibility in meeting the business needs of the customer and the company.
- Work creatively and independently with latitude and minimal supervision.
- Use experience and judgment to accomplish assigned goals and navigate organizational structure.
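For candidates preparing for this kind of role, here is a minimal PyTest sketch of a model quality gate; the dataset, model choice, and accuracy threshold are hypothetical placeholders for illustration, not part of any actual test suite.

```python
# Illustrative only: a minimal PyTest quality gate for an ML model.
# The dataset, model choice, and accuracy threshold are hypothetical.
import pytest
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

@pytest.fixture(scope="module")
def trained_model():
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42
    )
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    return model, X_test, y_test

def test_model_meets_accuracy_threshold(trained_model):
    """Fail the build if the model regresses below a minimum accuracy."""
    model, X_test, y_test = trained_model
    accuracy = accuracy_score(y_test, model.predict(X_test))
    assert accuracy >= 0.90

def test_predictions_are_valid_classes(trained_model):
    """Model output should contain only the known class labels (0, 1, 2)."""
    model, X_test, _ = trained_model
    assert set(model.predict(X_test)).issubset({0, 1, 2})
```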
Posted 5 days ago
The Airflow job market in India is growing rapidly as more companies adopt data pipelines and workflow automation. Apache Airflow, an open-source platform, is widely used for orchestrating complex computational workflows and data processing pipelines. Job seekers with Airflow expertise can find lucrative opportunities across industries such as technology, e-commerce, and finance.
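As a minimal sketch of what Airflow orchestration looks like in practice, the example below defines a two-task DAG; the task logic, names, and schedule are placeholders, and the syntax assumes Airflow 2.x.

```python
# Illustrative only: a minimal Airflow 2.x DAG with two dependent tasks.
# Task logic, names, and the schedule are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extracting raw data...")

def transform():
    print("transforming data...")

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # "schedule_interval" on Airflow versions before 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Run extract before transform.
    extract_task >> transform_task
```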
The average salary range for Airflow professionals in India varies by experience level:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum
In the Airflow field, a typical career path may progress as follows:
- Junior Airflow Developer
- Airflow Developer
- Senior Airflow Developer
- Airflow Tech Lead
In addition to Airflow expertise, professionals in this field are often expected to have or develop skills in:
- Python programming
- ETL concepts
- Database management (SQL)
- Cloud platforms (AWS, GCP)
- Data warehousing
As you explore Airflow job opportunities in India, showcase your expertise, skills, and experience confidently during interviews. Prepare well, stay current with the latest developments in Airflow, and demonstrate your problem-solving abilities to stand out in a competitive job market. Good luck!