Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5.0 years
0 Lacs
India
Remote
Job Title: Senior Data Engineer (Remote)
Location: Remote
Employment Type: Full-Time
Experience Level: Senior (5+ years)

About the Role
We are seeking a highly skilled Senior Data Engineer to join our growing data team. As a key contributor, you will design and build robust, scalable data pipelines and systems that power analytics and decision-making across the organization. You will work closely with data scientists, analysts, and product teams to ensure data accuracy, availability, and performance.

Key Responsibilities
Design, build, and maintain scalable and reliable ETL/ELT data pipelines using Python and SQL.
Develop and manage data infrastructure and workflows on AWS (e.g., S3, Lambda, Glue, Redshift, EMR, Athena).
Ensure high data quality and implement best practices for data governance and security.
Automate data ingestion from diverse structured and unstructured sources.
Optimize and monitor pipeline performance and resolve production issues.
Collaborate with stakeholders to define data requirements and deliver actionable data products.
Maintain and document architecture, data models, and pipelines.
Mentor junior engineers and contribute to engineering best practices and team culture.

Required Qualifications
5+ years of experience in data engineering or a related field.
Strong proficiency in Python for data processing and scripting.
Advanced knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL).
Hands-on experience with AWS data services: S3, Glue, Redshift, Lambda, Athena, etc.
Experience with orchestration tools (e.g., Airflow, AWS Step Functions).
Solid understanding of data warehousing, data modeling, and ETL best practices.
Familiarity with version control (e.g., Git) and CI/CD pipelines.

Preferred Qualifications
Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
Exposure to real-time data processing frameworks (e.g., Kafka, Spark, Kinesis).
Background in big data technologies and distributed computing.
Knowledge of data privacy regulations (GDPR, CCPA) and compliance practices.
Familiarity with dashboarding/BI tools (e.g., Tableau, Looker, QuickSight).
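For readers unfamiliar with the stack this role names, the following is a minimal, illustrative Python/SQL ETL sketch of the kind of work described: pull a CSV landed in S3, clean it with pandas, and load it into Redshift with a COPY command. All bucket names, table names, IAM roles, and credentials are hypothetical placeholders, not details from the posting.

```python
# Minimal batch ETL sketch: extract from S3, transform with pandas, load to Redshift.
import io

import boto3
import pandas as pd
import psycopg2

s3 = boto3.client("s3")

# Extract: read the raw file from a landing bucket (hypothetical names).
raw = s3.get_object(Bucket="example-landing", Key="orders/2025-06-01.csv")
orders = pd.read_csv(raw["Body"])

# Transform: basic cleaning and typing.
orders = orders.dropna(subset=["order_id"]).drop_duplicates("order_id")
orders["order_date"] = pd.to_datetime(orders["order_date"]).dt.date

# Stage the cleaned file back to S3, then load it into Redshift via COPY.
buf = io.BytesIO()
orders.to_csv(buf, index=False)
s3.put_object(Bucket="example-staging", Key="orders/clean.csv", Body=buf.getvalue())

conn = psycopg2.connect(host="example-cluster.redshift.amazonaws.com",
                        port=5439, dbname="analytics", user="etl", password="...")
with conn, conn.cursor() as cur:          # commits the transaction on success
    cur.execute("""
        COPY analytics.orders
        FROM 's3://example-staging/orders/clean.csv'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
        FORMAT AS CSV IGNOREHEADER 1;
    """)
conn.close()
```

In practice a pipeline like this would be wrapped in an orchestration tool such as Airflow or Step Functions, as the responsibilities above note.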
Posted 1 week ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
Overview

About this role
The Associate - Data Platform Engineer engages in projects from the start, including refining requirements and designing, developing, testing, deploying, and maintaining Enterprise Data Platform (EDP) enabling core framework components. Key tasks include automating data pipelines, supporting teams using the framework, adhering to data platform standards, and enhancing performance and scalability.

Responsibilities
Actively participate in chapter ceremony meetings and contribute to project planning and estimation.
Coordinate work with product managers, data owners, platform teams, and other stakeholders throughout the SDLC.
Use Airflow, Python, Snowflake, dbt, and related technologies to enhance and maintain EDP acquisition, ingestion, processing, orchestration and DQ frameworks.
Adopt new tools and technologies to enhance framework capabilities.
Build and conduct end-to-end tests to ensure production operations run successfully after every release cycle.
Document and present accomplishments and challenges to internal and external stakeholders.
Demonstrate deep understanding of modern data engineering tools and best practices.
Design and build solutions which are performant, consistent, and scalable.
Contribute to design decisions for complex systems.
Provide L2 / L3 support for technical and/or operational issues.

Qualifications
At least 5 years' experience as a data engineer
Expertise with SQL, stored procedures, UDFs
Advanced-level Python programming or advanced core Java programming
Experience with Snowflake or similar cloud-native databases
Experience with orchestration tools, especially Airflow
Experience with declarative transformation tools like dbt
Experience with Azure services, especially ADLS (or equivalent)
Exposure to real-time streaming platforms and message brokers (e.g., Snowpipe Streaming, Kafka)
Experience with Agile development concepts and related tools (ADO, Aha)
Experience conducting root cause analysis and resolving issues
Experience with performance tuning
Excellent written and verbal communication skills
Ability to operate in a matrixed organization and fast-paced environment
Strong interpersonal skills with a can-do attitude under challenging circumstances
Bachelor's degree in Computer Science is strongly preferred

Our Benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model
BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being.
Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
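The role above centres on maintaining ingestion and orchestration frameworks with Airflow, dbt and Snowflake. Below is a minimal, hedged sketch of that pattern, assuming Airflow 2.x and a dbt project already configured against Snowflake; the dag_id, paths and targets are hypothetical.

```python
# Minimal Airflow DAG sketch: run dbt models, then dbt tests as a data-quality gate.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="edp_dbt_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",          # use schedule_interval on Airflow < 2.4
    catchup=False,
    tags=["edp", "dbt"],
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/edp --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/edp --target prod",
    )

    dbt_run >> dbt_test   # only test once the models have built successfully
```

A production framework of the kind described would add retries, SLAs and alerting on top of this skeleton.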
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Ninja Van is a late-stage logtech startup that is disrupting a massive industry with innovation and cutting edge technology. Launched 2014 in Singapore, we have grown rapidly to become one of Southeast Asia's largest and fastest-growing express logistics companies. Since our inception, we’ve delivered to 100 million different customers across the region with added predictability, flexibility and convenience. Join us in our mission to connect shippers and shoppers across Southeast Asia to a world of new possibilities. More About Us We process 250 million API requests and 3TB of data every day. We deliver more than 2 million parcels every day. 100% network coverage with 2600+ hubs and stations in 6 SEA markets (Singapore, Malaysia, Indonesia, Thailand, Vietnam and Philippines), reaching 500 million consumers. 2 Million active shippers in all e-commerce segments, from the largest marketplaces to the individual social commerce sellers. Raised more than US$500 million over five rounds. We are looking for world-class talent to join our crack team of engineers, product managers and designers. We want people who are passionate about creating software that makes a difference to the world. We like people who are brimming with ideas and who take initiative rather than wait to be told what to do. We prize team-first mentality, personal responsibility and tenacity to solve hard problems and meet deadlines. As part of a small and lean team, you will have a very direct impact on the success of the company. Roles & Responsibilities Design, develop, and maintain Ninja Van's infrastructure for data streaming, processing, and storage . Build tools to ensure effective maintenance and monitoring of the data infrastructure. Contribute to key architectural decisions for data pipelines and lead the implementation of major initiatives. Collaborate with stakeholders to deliver scalable and high-performance solutions for data requirements, including extraction, transformation, and loading (ETL) from diverse data sources. Enhance the team's data capabilities by sharing knowledge , enforcing best practices , and promoting data-driven decision-making . Develop and enforce Ninja Van's data retention policies and backup strategies, ensuring data is stored redundantly and securely. Requirements Solid computer science fundamentals, excellent problem-solving skills, and a strong understanding of distributed computing principles. At least 8+ years of experience in a similar role, with a proven track record of building scalable and high-performance data infrastructure using Python, PySpark, Spark, and Airflow. Expert-level SQL knowledge and extensive experience working with both relational and NoSQL databases. Advanced knowledge of Apache Kafka, along with demonstrated proficiency in Hadoop v2, HDFS, and MapReduce. Hands-on experience with stream-processing systems (e.g., Storm, Spark Streaming), big data querying tools (e.g., Pig, Hive, Spark), and data serialization frameworks (e.g., Protobuf, Thrift, Avro). [Good to have] Familiarity with infrastructure-as-code technologies like Terraform, Terragrunt, Ansible, or Helm. Don’t worry if you don’t have this experience—what matters is your interest in learning! [Good to have] Experience with Change Data Capture (CDC) technologies such as Maxwell or Debezium. Bachelor’s or Master’s degree in Computer Science or a related field from a top university. 
Tech Stack
Backend: Play (Java 8+), Golang, Node.js, Python, FastAPI
Frontend: AngularJS, ReactJS
Mobile: Android, Flutter, React Native
Cache: Hazelcast, Redis
Data storage: MySQL, TiDB, Elasticsearch, Delta Lake
Infrastructure monitoring: Prometheus, Grafana
Orchestrator: Kubernetes
Containerization: Docker, Containerd
Cloud Provider: GCP, AWS
Data pipelines: Apache Kafka, Spark Streaming, Maxwell/Debezium, PySpark, TiCDC
Workflow manager: Apache Airflow
Query engines: Apache Spark, Trino

Submit a job application
By applying to the job, you acknowledge that you have read, understood and agreed to our Privacy Policy Notice (the “Notice”) and consent to the collection, use and/or disclosure of your personal data by Ninja Logistics Pte Ltd (the “Company”) for the purposes set out in the Notice. In the event that your job application or personal data was received from any third party pursuant to the purposes set out in the Notice, you warrant that such third party has been duly authorised by you to disclose your personal data to us for the purposes set out in the Notice.
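The role and stack above revolve around streaming infrastructure (Kafka, Spark Streaming, PySpark). The sketch below shows the shape of such a job with PySpark Structured Streaming: consume events from Kafka and append them to object storage as Parquet. Topic, broker, schema and path names are invented, and the job assumes the spark-sql-kafka connector package is on the classpath.

```python
# Illustrative Structured Streaming job: Kafka -> parsed JSON -> Parquet on object storage.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("parcel-events-stream").getOrCreate()

event_schema = StructType([
    StructField("parcel_id", StringType()),
    StructField("status", StringType()),
    StructField("hub", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")
    .option("subscribe", "parcel-events")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers bytes; decode the value column and parse the JSON payload.
events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://datalake/parcel_events/")
    .option("checkpointLocation", "s3a://datalake/_checkpoints/parcel_events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```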
Posted 1 week ago
15.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. Those in artificial intelligence and machine learning at PwC will focus on developing and implementing advanced AI and ML solutions to drive innovation and enhance business processes. Your work will involve designing and optimising algorithms, models, and systems to enable intelligent decision-making and automation.

Years of Experience: Candidates with 15+ years of hands-on experience

Required Skills (Must Have)
Solid knowledge and experience of supervised and unsupervised machine learning algorithms, e.g. (but not limited to) linear regressions, Bayesian regressions, multi-objective optimization techniques, classifiers, cluster analysis, dimension reduction.
Understanding of the techniques used for retail analytics across loyalty, customer analytics, assortment, promotion and marketing.
Good knowledge of statistics, e.g. statistical tests and distributions.
Experience in data analysis, e.g. data cleansing, standardization and data preparation for machine learning use cases.
Experience in machine learning frameworks and tools (e.g. scikit-learn, mlr, caret, H2O, TensorFlow, PyTorch, MLlib).
Advanced-level programming in SQL and Python/PySpark to guide teams.
Expertise with visualization tools, e.g. Tableau, Power BI, AWS QuickSight.

Nice To Have
Working knowledge of containerization (e.g. AWS EKS, Kubernetes), Docker and data pipeline orchestration (e.g. Airflow).
Experience with model explainability and interpretability techniques.
Ability to multi-task and manage multiple deadlines.
Responsibility for incorporating client/user feedback into the product.
Ability to think through complex user scenarios and design simple yet effective user interactions.
Good communication and presentation skills.

Educational Background: BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA
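As a small illustration of the unsupervised techniques the role lists (cluster analysis for customer analytics), here is a self-contained scikit-learn sketch; the synthetic data and feature names are purely illustrative.

```python
# Customer segmentation sketch: scale features, then cluster with k-means.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
customers = pd.DataFrame({
    "annual_spend": rng.gamma(2.0, 500.0, size=1000),
    "visits_per_month": rng.poisson(4, size=1000),
    "avg_basket_size": rng.normal(35.0, 10.0, size=1000),
})

segmentation = Pipeline([
    ("scale", StandardScaler()),                                  # put features on one scale
    ("cluster", KMeans(n_clusters=4, n_init=10, random_state=0)),
])

customers["segment"] = segmentation.fit_predict(customers)
print(customers.groupby("segment").mean())                       # profile each segment
```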
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Us Yubi stands for ubiquitous. But Yubi will also stand for transparency, collaboration, and the power of possibility. From being a disruptor in India’s debt market to marching towards global corporate markets from one product to one holistic product suite with seven products Yubi is the place to unleash potential. Freedom, not fear. Avenues, not roadblocks. Opportunity, not obstacles. About YUBI Yubi, formerly known as CredAvenue, is re-defining global debt markets by freeing the flow of finance between borrowers, lenders, and investors. We are the world's possibility platform for the discovery, investment, fulfilment, and collection of any debt solution. At Yubi, opportunities are plenty and we equip you with tools to seize it. In March 2022, we became India’s fastest fintech and most impactful startup to join the unicorn club with a Series B fundraising round of $137 million. In 2020, we began our journey with a vision of transforming and deepening the global institutional debt market through technology. Our two-sided debt marketplace helps institutional and HNI investors find the widest network of corporate borrowers and debt products on one side and helps corporates to discover investors and access debt capital efficiently on the other side. Switching between platforms is easy, which means investors can lend, invest and trade bonds - all in one place. All 5 of our platforms shake up the traditional debt ecosystem and offer new ways of digital finance. Yubi Loans – Term loans and working capital solutions for enterprises. Yubi Invest – Bond issuance and investments for institutional and retail participants. Yubi Pool– End-to-end securitisations and portfolio buyouts. Yubi Flow – A supply chain platform that offers trade financing solutions. Yubi Co.Lend – For banks and NBFCs for co-lending partnerships. Currently, we have boarded over 4000+ corporates, 350+ investors and have facilitated debt volumes of over INR 40,000 crore. Backed by marquee investors like Insight Partners, B Capital Group, Dragoneer, Sequoia Capital, LightSpeed and Lightrock, we are the only-of-its-kind debt platform globally, revolutionizing the segment. At Yubi, People are at the core of the business and our most valuable assets. Yubi is constantly growing, with 650+ like-minded individuals today, who are changing the way people perceive debt. We are a fun bunch who are highly motivated and driven to create a purposeful impact. Come, join the club to be a part of our epic growth story. About The Role This role requires a well-rounded data engineer with hands-on experience in data processing technologies, a good understanding of data modeling concepts, and the ability to collaborate effectively with various stakeholders. The willingness to adapt to flexible working hours demonstrates a commitment to supporting the continuous operation of data pipelines and meeting business needs. Responsibilities: Build Data Pipelines: Utilize PySpark and Python to construct efficient and scalable data pipelines. Integrate data from multiple source systems into a unified target system. Orchestrate Pipelines with Airflow: Use Apache Airflow to orchestrate and schedule data pipelines, ensuring timely and reliable execution. Enhance Existing Pipelines: Understand existing data pipelines and make enhancements based on evolving business requirements. Implement improvements to optimize performance and maintainability. Debugging and Root Cause Analysis: Troubleshoot and resolve any failures in data pipelines promptly. 
Conduct root cause analysis for pipeline failures and implement corrective measures.
Collaboration with Stakeholders: Work closely with various stakeholders, both within and across teams. Communicate effectively to understand and address business needs related to data processing.
Weekend and Shift Support: Be available to work on weekends and in shifts if necessary to provide support for business operations.

Requirements
Experience: 3-5 years of experience as a data engineer, demonstrating a solid understanding of data engineering principles.
Technical Skills: Proficient in SQL, Python, and PySpark for designing and implementing data solutions. Knowledge of data warehousing techniques and dimensional modeling.
Orchestration Tools: Experience with Apache Airflow for orchestrating complex data workflows. Familiarity with containerization using Docker and version control systems.
Data Modeling and Transformation: Strong proficiency in data modeling techniques, with expertise in designing and implementing effective data structures. Knowledge of dbt (Data Build Tool) for transforming and modeling data.
Cloud Platform: AWS knowledge is a plus, showcasing familiarity with cloud-based data services and infrastructure.
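The responsibilities above describe integrating multiple source systems into a unified target with PySpark. A minimal sketch of that pattern follows; the paths, JDBC details and column names are hypothetical placeholders, not Yubi systems.

```python
# Illustrative PySpark batch pipeline: join two sources into one curated target.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("unify_loan_sources").getOrCreate()

# Source 1: files exported from an upstream system.
loans = spark.read.parquet("s3a://raw/loans/")

# Source 2: a relational source read over JDBC.
borrowers = spark.read.jdbc(
    url="jdbc:postgresql://example-db:5432/lending",
    table="public.borrowers",
    properties={"user": "etl", "password": "...", "driver": "org.postgresql.Driver"},
)

unified = (
    loans.join(borrowers, on="borrower_id", how="left")
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["loan_id"])
)

# Target: partitioned Parquet that Airflow-scheduled downstream jobs consume.
unified.write.mode("overwrite").partitionBy("loan_status").parquet("s3a://curated/loans_unified/")
```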
Posted 1 week ago
12.0 years
5 - 7 Lacs
Hyderābād
On-site
Job description Some careers shine brighter than others If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. Analytics Foundations Enabler IT team provides the required IT platform for the model developers to develop / train models and eventually deploy them in an automated way into production. Analytics Foundations Enabler IT team ensures these models are packaged such that they are exposed as ‘Model as a Service’ to be consumed by various business functions as part of their data driven decisioning use cases. We are seeking a talented and experienced POD Lead to join our dynamic team, with experience in software development and a strong background in Python, GCP, Angular, and Kubernetes. The ideal candidate will have a proven track record of technical leadership, stakeholder management, and excellent communication skills. This role will involve working closely with cross-functional teams to deliver high-quality software solutions while driving innovation and continuous improvement. In this role, you will: Lead and manage a team of software engineers, providing technical guidance, mentorship, and support to ensure the successful delivery of software projects. Collaborate with product managers, architects, and other stakeholders to define and prioritize software requirements, ensuring alignment with business objectives. Conceptualise, design, develop and reuse effective engineering design, patterns & frameworks using Python, GCP, Angular, and Kubernetes, adhering to best practices and industry standards. Foster a culture of continuous improvement, encouraging the team to identify and implement process improvements and innovative solutions. Act as an IT Service Owner and ensure compliance across Incident, Problem, Change/Release management and other associated IT controls Ensure service resilience, service sustainability and recovery time objectives are met for all the software solutions delivered. Drive operational, delivery and engineering excellence across the pod teams. Be accountable for production and for delivery. Requirements To be successful in this role, you should meet the following requirements: 12+ years of experience in software development, with a strong background in Python, Java Springboot, GCP, Angular, and Kubernetes, awareness of Model Life Cycle Management & MLOPs will be a plus. Proven experience in technical leadership, managing software development teams, and delivering complex software projects. Excellent stakeholder management and communication skills, with the ability to effectively convey complex technical concepts to both technical and non-technical audiences. 
Software engineering skills: microservice architecture patterns, frameworks like FastAPI, REST APIs, and experience with API security standards, API gateways, and service mesh.
DevOps skills: proficiency in tools such as Docker, Kubernetes, Helm, Terraform.
Orchestrating data pipelines: setting up and automating data pipelines using tools such as Airflow, and familiarity with data processing technologies including NumPy, Pandas, Amazon S3, Kubeflow, Dataflow.
Expertise in monitoring and observability technologies like Prometheus, AppDynamics, Splunk, Jaeger, Kiali, OpenTelemetry.
GCP experience, including management of GKE clusters.
Good-to-have skills: programming and working knowledge of machine learning algorithms and frameworks such as scikit-learn and PyTorch; familiarity with industry solutions like Google Vertex AI.

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India
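The posting above describes exposing models as "Model as a Service" behind FastAPI microservices. The following is a minimal, hedged sketch of such a service; the endpoint, payload shape and placeholder scoring logic are invented, and a real service would load a trained model artifact instead.

```python
# Minimal "model as a service" sketch with FastAPI. Run with: uvicorn model_service:app
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="model-as-a-service")


class ScoreRequest(BaseModel):
    features: list[float]


class ScoreResponse(BaseModel):
    score: float
    model_version: str


@app.post("/v1/score", response_model=ScoreResponse)
def score(req: ScoreRequest) -> ScoreResponse:
    # Placeholder "model": average of the features, guarded against empty input.
    value = sum(req.features) / max(len(req.features), 1)
    return ScoreResponse(score=value, model_version="0.0.1-example")
```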
Posted 1 week ago
2.0 years
2 - 7 Lacs
Hyderābād
On-site
India - Hyderabad JOB ID: R-216752 ADDITIONAL LOCATIONS: India - Hyderabad WORK LOCATION TYPE: On Site DATE POSTED: Jun. 06, 2025 CATEGORY: Information Systems Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. What you will do Let’s do this. Let’s change the world. We are seeking a highly skilled Machine Learning Engineer with a strong MLOps background to join our team. You will play a pivotal role in building and scaling our machine learning models from development to production. Your expertise in both machine learning and operations will be essential in creating efficient and reliable ML pipelines. Roles & Responsibilities: Collaborate with data scientists to develop, train, and evaluate machine learning models. Build and maintain MLOps pipelines, including data ingestion, feature engineering, model training, deployment, and monitoring. Leverage cloud platforms (AWS, GCP, Azure) for ML model development, training, and deployment. Implement DevOps/MLOps best practices to automate ML workflows and improve efficiency. Develop and implement monitoring systems to track model performance and identify issues. Conduct A/B testing and experimentation to optimize model performance. Work closely with data scientists, engineers, and product teams to deliver ML solutions. Guide and mentor junior engineers in the team Stay updated with the latest trends and advancements What we expect of you We are all different, yet we all use our unique contributions to serve patients. 
Basic Qualifications:
Doctorate degree and 2 years of Computer Science, Statistics, Data Science, or Machine Learning experience; OR
Master’s degree and 8 to 10 years of Computer Science, Statistics, Data Science, or Machine Learning experience; OR
Bachelor’s degree and 10 to 14 years of Computer Science, Statistics, Data Science, or Machine Learning experience; OR
Diploma and 14 to 18 years of Computer Science, Statistics, Data Science, or Machine Learning experience.

Preferred Qualifications:

Must-Have Skills:
Strong foundation in machine learning algorithms and techniques.
Experience in MLOps practices and tools (e.g., MLflow, Kubeflow, Airflow); experience in DevOps tools (e.g., Docker, Kubernetes, CI/CD).
Proficiency in Python and relevant ML libraries (e.g., TensorFlow, PyTorch, scikit-learn).
Outstanding analytical and problem-solving skills; ability to learn quickly; excellent communication and interpersonal skills.

Good-to-Have Skills:
Experience with big data technologies (e.g., Spark) and performance tuning in query and data processing.
Experience with data engineering and pipeline development.
Experience in statistical techniques and hypothesis testing, including regression analysis, clustering and classification.
Knowledge of NLP techniques for text analysis and sentiment analysis.
Experience in analyzing time-series data for forecasting and trend analysis.
Familiarity with AWS, Azure, or Google Cloud; familiarity with the Databricks platform for data analytics and MLOps.

Professional Certifications: Cloud Computing and Databricks certifications preferred.

Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team: careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
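The MLOps responsibilities above (training, tracking, and preparing models for deployment and monitoring) can be illustrated with a short MLflow sketch. The experiment name, parameters and toy dataset are invented; it assumes MLflow with the scikit-learn flavor installed.

```python
# Hedged sketch of the train-and-track step of an MLOps pipeline using MLflow.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

mlflow.set_experiment("example-mlops-pipeline")

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params, random_state=0).fit(X_tr, y_tr)

    # Log what a later promotion or monitoring job would need: params, metrics, artifact.
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy_score(y_te, model.predict(X_te)))
    mlflow.sklearn.log_model(model, "model")
```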
Posted 1 week ago
8.0 years
5 - 7 Lacs
Hyderābād
On-site
Job description Some careers shine brighter than others If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. Analytics Foundations Enabler IT team provides the required IT platform for the model developers to develop / train models and eventually deploy them in an automated way into production. Analytics Foundations Enabler IT team ensures these models are packaged such that they are exposed as ‘Model as a Service’ to be consumed by various business functions as part of their data driven decisioning use cases. We are seeking a talented and experienced Technical Lead to join our dynamic team, with experience in software development and a strong background in Python, GCP, Angular, and Kubernetes. The ideal candidate will have a proven track record of technical leadership, stakeholder management, and excellent communication skills. This role will involve working closely with cross-functional teams to deliver high-quality software solutions while driving innovation and continuous improvement. In this role, you will: Requirement analysis: Analyze the requirements. Collaborate with business in discussing the feasibility, finalizing the requirements, provide inputs on estimating the effort and schedule, conduct sprint planning sessions, with well-defined user stories and story points and bring consensus on the deadlines. Impact Analysis: Identify the dependencies and blockers beforehand and prepare a remediation plan for them. This may require engaging with other teams within and across Business lines, articulate the impact at their end, changes required and the expectation on their involvement during various project execution phases. Design: Develop / Review the Technical Design. The design/architecture must conform to the department and organization’s tactical and strategic objectives. Raise issues if any in advance to the concerned teams/Business. Coding: Analyze, develop/review code as per specifications. Ensure to have all the code to be in line with the defined coding standards and best practices. Reviews: Conduct review of design/code/test plan and test results. Fix any defects in line with shift left philosophy. Ensure the quality of deliveries and ensure conformance to the outlined processes and practices. Testing: Engage with source and downstream interface teams. Deliver well structured, maintainable and fully tested systems to time and budget. Implementation: Conduct Release planning sessions near the end of each sprint, prepare/review plan for implementation and ensure smooth execution of releases. Support: Provide post implementation support. Participate in the 24x7 on call support duties and own the responsibility for fixing any and all events in production. Audits: Must have better understanding of SOX audit requirements and executions. Ensure full compliance with SOX and various other audit requirements. 
Conduct the sprint review during sprint execution and conduct sprint retrospective sessions post-implementation with the team; update and maintain documentation for team processes, best practices, and software run books. Approach problems intuitively and with an open mind, within the context of the application and team. Welcome and accommodate changing requirements, even late in development, to give our customers a competitive advantage. Collaborate with globally located cross-functional teams in building customer-centric products.

Requirements
To be successful in this role, you should meet the following requirements:
8+ years of experience in software development, with a strong background in Python, GCP, Angular, and Kubernetes; awareness of Model Life Cycle Management and MLOps is a plus.
Excellent stakeholder management and communication skills, with the ability to effectively convey complex technical concepts to both technical and non-technical audiences.
Software engineering skills: microservice architecture patterns, frameworks like FastAPI, REST APIs, and experience with API security standards, API gateways, and service mesh.
Knowledge of automation testing with Cucumber, PyTest or equivalent, and BDD frameworks.
DevOps skills: proficiency in tools such as Docker, Kubernetes, Helm, Terraform.
Extensive experience with Airflow installation on Kubernetes, configuration, DAG setup, and performance optimization.
Experience with the React or Vue JavaScript frameworks, as well as jQuery and Bootstrap 4, is a plus.
Expertise in monitoring and observability technologies like Prometheus, AppDynamics, Splunk, Jaeger, Kiali, OpenTelemetry.
GCP experience, including management of GKE clusters.
Good-to-have skills: programming and working knowledge of machine learning algorithms and frameworks such as scikit-learn and PyTorch; familiarity with industry solutions like Google Vertex AI.

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India
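Since the requirements above call out automation testing (PyTest) alongside FastAPI services, here is a small, hedged test sketch using pytest with FastAPI's TestClient. The endpoint and payload shape are hypothetical; run with `pytest`.

```python
# Illustrative automated tests for a scoring endpoint (pytest + FastAPI TestClient).
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()


@app.post("/v1/score")
def score(payload: dict) -> dict:
    features = payload.get("features", [])
    return {"score": sum(features) / max(len(features), 1)}


client = TestClient(app)


def test_score_returns_mean_of_features():
    resp = client.post("/v1/score", json={"features": [1.0, 2.0, 3.0]})
    assert resp.status_code == 200
    assert resp.json()["score"] == 2.0


def test_score_handles_empty_feature_list():
    resp = client.post("/v1/score", json={"features": []})
    assert resp.status_code == 200
    assert resp.json()["score"] == 0.0
```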
Posted 1 week ago
10.0 years
6 - 9 Lacs
Hyderābād
On-site
Lead, Application Development Hyderabad, India; Ahmedabad, India; Gurgaon, India Information Technology 316190 Job Description About The Role: Grade Level (for internal use): 11 S&P Global EDO The Role: Lead- Software Engineering IT- Application Development. Join Our Team: Step into a dynamic team at the cutting edge of data innovation! You’ll collaborate daily with talented professionals from around the world, designing and developing next-generation data products for our clients. Our team thrives on a diverse toolkit that evolves with emerging technologies, offering you the chance to work in a vibrant, global environment that fosters creativity and teamwork. The Impact: As a Lead Software Developer at S&P Global, you’ll be a driving force in shaping the future of our data products. Your expertise will streamline software development and deployment, aligning cutting-edge solutions with business needs. By ensuring seamless integration and continuous delivery, you’ll enhance product capabilities, delivering high-quality systems that meet the highest standards of availability, security, and performance. Your work will empower our clients with impactful, data-driven solutions, making a real difference in the financial world. What’s in it for You: Career Development: Build a rewarding career with a global leader in financial information and analytics, supported by continuous learning and a clear path to advancement. Dynamic Work Environment: Thrive in a fast-paced, forward-thinking setting where your ideas fuel innovation and your contributions shape groundbreaking solutions. Skill Enhancement: Elevate your expertise on an enterprise-level platform, mastering the latest tools and techniques in software development. Versatile Experience: Dive into full-stack development with hands-on exposure to cloud computing, Bigdata, and revolutionary GenAI technologies. Leadership Opportunities: Guide and inspire a skilled team, steering the direction of our products and leaving your mark on the future of technology at S&P Global. Responsibilities: Architect and develop scalable Bigdata and cloud applications, harnessing a range of cloud services to create robust, high-performing solutions. Design and implement advanced CI/CD pipelines, automating software delivery for fast, reliable deployments that keep us ahead of the curve. Tackle complex challenges head-on, troubleshooting and resolving issues to ensure our products run flawlessly for clients. Lead by example, providing technical guidance and mentoring to your team, driving innovation and embracing new processes. Deliver top-tier code and detailed system design documents, setting the standard with technical walkthroughs that inspire excellence. Bridge the gap between technical and non-technical stakeholders, turning complex requirements into elegant, actionable solutions. Mentor junior developers, nurturing their growth and helping them build skills and careers under your leadership. What We’re Looking For: We’re seeking a passionate, experienced professional with: 10-13 years of hands-on experience designing and building data-intensive solutions using distributed computing, showcasing your mastery of scalable architectures. Proven success implementing and maintaining enterprise search solutions in large-scale environments, ensuring peak performance and reliability. A history of partnering with business stakeholders and users to shape research directions and craft robust, maintainable products. 
Extensive experience deploying data engineering solutions in public clouds like AWS, GCP, or Azure, leveraging cloud power to its fullest. Advanced programming skills in Python, Java, .NET or Scala, backed by a portfolio of impressive projects. Strong knowledge of Gen AI tools (e.g., GitHub Copilot, ChatGPT, Claude, or Gemini) and their power to boost developer productivity. Expertise in containerization, scripting, cloud platforms, and CI/CD practices, ready to shine in a modern development ecosystem. 5+ years working with Python, Java, .NET, Kubernetes, and data/workflow orchestration tools, proving your technical versatility. Deep experience with SQL, NoSQL, Apache Spark, Airflow, or similar tools, operationalizing data-driven pipelines for large-scale batch and stream processing. A knack for rapid prototyping and iteration, delivering high-quality solutions under tight deadlines. Outstanding communication and documentation skills, adept at explaining complex ideas to technical and non-technical audiences alike. Take the Next Step: Ready to elevate your career and make a lasting impact in data and technology? Join us at S&P Global and help shape the future of financial information and analytics. Apply today! Return to Work Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative (link to career site page when available), we are encouraging enthusiastic and talented returners to apply and will actively support your return to the workplace. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. 
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316190 Posted On: 2025-06-06 Location: Hyderabad, Telangana, India
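The lead role above asks for operationalizing data-driven pipelines with workflow orchestration tools such as Airflow. As a compact illustration, the following sketch uses Airflow's TaskFlow API (Airflow 2.x); the DAG name and toy data are invented for demonstration.

```python
# Batch pipeline sketch with the Airflow TaskFlow API: extract -> transform -> load.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False, tags=["example"])
def market_data_batch():
    @task
    def extract() -> list[dict]:
        # Stand-in for pulling from an upstream API or object store.
        return [{"ticker": "ABC", "close": 101.5}, {"ticker": "XYZ", "close": 55.2}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        return [{**r, "close_cents": int(r["close"] * 100)} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        print(f"would write {len(rows)} rows to the warehouse")

    load(transform(extract()))


market_data_batch()
```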
Posted 1 week ago
3.0 years
0 Lacs
Hyderābād
On-site
Core Skills / Secondary Skills
Bachelor's in Computer Science, Computer Engineering or a related field.
5+ years of development experience with Spark (PySpark), Python and SQL.
Extensive knowledge of building data pipelines.
Hands-on experience with Databricks development.
Strong experience developing on Linux OS.
Experience with scheduling and orchestration (e.g. Databricks Workflows, Airflow, Prefect, Control-M).
Solid understanding of distributed systems, data structures, and design principles.
Agile development methodologies (e.g. SAFe, Kanban, Scrum).
Comfortable communicating with teams via showcases/demos.
Play a key role in establishing and implementing migration patterns for the Data Lake Modernization project.
Actively migrate use cases from our on-premises Data Lake to Databricks on GCP.
Collaborate with Product Management and business partners to understand use case requirements and reporting.
Adhere to internal development best practices/lifecycle (e.g. testing, code reviews, CI/CD, documentation).
Document and showcase feature designs/workflows.
Participate in team meetings and discussions around product development.
Stay up to date on the latest industry trends and design patterns.
3+ years of experience with Git.
3+ years of experience with CI/CD (e.g. Azure Pipelines).
Experience with streaming technologies, such as Kafka and Spark.
Experience building applications on Docker and Kubernetes.
Cloud experience (e.g. Azure, Google).

Your future duties and responsibilities

Required qualifications to be successful in this role

Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
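The Data Lake Modernization work described above involves migrating datasets from an on-premises lake into Databricks. One common shape of that migration is sketched below, under the assumption of a Delta-enabled runtime (Databricks or delta-spark); paths, table and column names are hypothetical.

```python
# Sketch of one migration pattern: land a legacy-lake export as a managed Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lake_to_databricks_migration").getOrCreate()

# Read the dataset exported from the legacy lake (e.g. copied into GCS).
orders = spark.read.parquet("gs://onprem-export/orders/")

cleaned = (
    orders.dropDuplicates(["order_id"])
    .withColumn("ingested_at", F.current_timestamp())
)

# Write a partitioned Delta table for downstream Databricks Workflows to consume.
(
    cleaned.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("bronze.orders")
)
```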
Posted 1 week ago
8.0 years
5 - 9 Lacs
Gurgaon
On-site
Python Developer
Gurgaon, India | Information Technology | 316038

Job Description
About The Role: Grade Level (for internal use): 10

The Team: Financial Risk Analytics at S&P Global provides products and solutions to financial institutions to measure and manage their counterparty credit risk, market risk, regulatory risk capital and derivative valuation adjustments. Using the latest analytics and technology, such as a fully vectorized pricing library, Machine Learning and a Big Data stack for scalability, our products and solutions are used by the largest tier-one banks to smaller niche firms. Our products are available deployed, in the cloud, or can be run as a service. We have a need for an enthusiastic and skilled Senior Python developer who is interested in learning about quantitative analytics and perhaps looking to make a career at the intersection of Financial Analytics, Big Data and Mathematics!

The Impact: You will be working on a strategic component that allows clients to extract, on demand, the data required for pricing and risk calculations. This is an essential entry point to a risk calculation which requires speed to market and good design to drive efficient and robust workflows.

What’s in it for you: The successful candidate will gain exposure to risk analytics and the latest trending technology, allowing you to grow into a hybrid role specializing in both financial markets and technology – a highly rewarding, challenging, and marketable position to gain skills in.

Responsibilities:
The successful candidate will work on the Market Risk solution with a technology stack that is best of breed, involving Python 3.10+, Airflow, Pandas, NumPy, ECS (AWS). You will join a fast-paced, dynamic team environment, building commercial products that are at the heart of the business and contributing directly to revenue generation.
Design and implement end-to-end applications in Python with an emphasis on efficiently writing functions on large datasets.
Interpret and analyse business use-cases and feature requests into technical designs and development tasks.
Participate in regular design and code review meetings.
Be a responsive team player in system architecture and design discussions.
Be proud of the high quality of your own work. Always follow quality standards (unit tests, integration tests and documented code).
Happy to coach and mentor junior engineers.
Be delivery-focused, have a passion for technology and enjoy offering new ideas and approaches.
Demonstrable technical capacity in understanding technical deliveries and dependencies.
Strong experience in working on software engineering projects in an Agile manner.

What We’re Looking For:
Bachelor’s degree in Computer Science, Engineering, or a related discipline, or equivalent experience.
Computer Science and Software Engineering: strong software development experience.
Minimum 8 years' experience in developing applications using Python. Experience using Python 3.10+.
Core Python with rich knowledge in OO methodologies and design.
Experience writing Python code that is scalable and performant.
Experience/exposure to complex data types when designing, and anticipating issues that impact performance (under ETL processes) by generating metrics using industry-adopted profiling tools during development.
Experience working on AWS, ECS, S3 and ideally MWAA (hosted Airflow on AWS).
Experience working in data engineering/orchestration and scalable, efficient flow design.
Experience in developing data pipelines using Airflow.
Good working competency in Docker, Git, Linux Good working knowledge of Pandas and NumPy Understanding of CI/CD pipelines Test frameworks. Agile and XP (Scrum, Kanban, TDD) Experience with cloud-based infrastructures, preferably with AWS. Fluent in English Passionate individual who thrives development, data and is hands on. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. 
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316038 Posted On: 2025-05-27 Location: Gurgaon, Haryana, India
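The Python developer role above emphasises "efficiently writing functions on large datasets" with Pandas and NumPy. The hedged sketch below shows what that usually means in practice: operate on whole columns instead of looping row by row. The trade data and formula are invented for demonstration.

```python
# Vectorised column-wise computation with pandas/NumPy instead of Python-level loops.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
trades = pd.DataFrame({
    "notional": rng.uniform(1e5, 1e7, size=1_000_000),
    "rate": rng.uniform(0.01, 0.05, size=1_000_000),
    "days": rng.integers(1, 365, size=1_000_000),
})

# One pass over each column; no row-by-row iteration.
trades["exposure"] = trades["notional"] * trades["rate"] * (trades["days"] / 360.0)

# Column-wise bucketing with np.select instead of DataFrame.apply.
conditions = [trades["exposure"] > 1e5, trades["exposure"] > 1e4]
trades["bucket"] = np.select(conditions, ["high", "medium"], default="low")

print(trades["bucket"].value_counts())
```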
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
About Highspot Highspot is a software product development company and a recognized global leader in the sales enablement category, leveraging cutting-edge AI and GenAI technologies at the core of its robust Software-as-a-Service (SaaS) platform. Highspot is revolutionizing how millions of individuals work worldwide. Through its AI-powered platform, Highspot drives enterprise transformation to empower sales teams through intelligent content management, training, contextual guidance, customer engagement, meeting intelligence, and actionable analytics. The Highspot platform delivers advanced features tailored to business needs, in a modern design that sales and marketing executives appreciate and is the #1 rated sales enablement platform on G2 Crowd. While headquartered in Seattle, Highspot has expanded its footprint across America, Canada, the UK, Germany, Australia, and now India, solidifying its presence in the Asia Pacific markets. About The Role As a Senior Data Engineer, you will be responsible for the end-to-end data pipeline, ensuring the reliability, efficiency, and scalability of our data systems. You will collaborate closely with cross-functional teams, including data scientists, analysts, and software engineers, to develop robust data solutions that empower data-driven decision-making. Responsibilities Create optimal data pipeline architecture Develop, and maintain end-to-end scalable data infrastructure and pipelines. Identify, design, and implement process improvements, automating manual processes and optimizing data delivery. Assist internal stakeholders with data-related technical issues and support their data infrastructure needs. Develop data tools for analytics and data scientists, contributing to product innovation. Collaborate with data and analytics experts to enhance functionality in data systems. Design and implement data models, ensuring integrity and consistency. Identify and resolve performance bottlenecks, optimizing queries and processing. Implement data governance best practices for quality, security, and compliance. Stay informed about emerging technologies, contribute to tool selection, and enhance data infrastructure. Create and maintain comprehensive documentation for data engineering processes and workflows. Drive the team's data strategy, technical roadmap, and data storage solutions. Empower the team for self-servicing and efficient troubleshooting. Required Qualifications Bachelor’s degree or equivalent experience. 7+ years of experience using SQL in an advanced capacity in query authoring, tuning, and identifying useful abstractions. 5 years of hands-on experience in Python, demonstrating proficiency in data wrangling and object-oriented programming Expertise in designing, creating, managing, and utilizing large datasets. Ability to build efficient, flexible, extensible, and scalable ETL and reporting solutions. Root cause analysis experience on internal and external data and processes. Cloud-based database platforms experience, preferably Snowflake. 3+ years of git experience for version control and collaboration. 3+ years of experience working with AWS cloud technology. 3+ years of experience working with workflow management platforms like airflow/dagster 2+ years experience using Tableau to build impactful reports and dashboards Experience working with dbt is a plus Strong analytical and problem-solving skills. Cross-functional team collaboration experience in a dynamic environment. 
Proven track record of navigating ambiguity, prioritizing needs, and solving impactful business problems.
Excellent written and verbal communication skills for technical and non-technical audiences.
Empathy-driven, supporting team success.
Remote work experience with a U.S.-based team is preferred.
Equal Opportunity Statement
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of age, ancestry, citizenship, color, ethnicity, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or invisible disability status, political affiliation, veteran status, race, religion, or sexual orientation.
Did you read the requirements as a checklist and not tick every box? Don't rule yourself out! If this role resonates with you, hit the ‘apply’ button. Show more Show less
Posted 1 week ago
5.0 years
4 - 9 Lacs
Bengaluru
On-site
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.
Job Description
Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.
Responsibilities:
Design and implement the data modeling, data ingestion, and data processing for various datasets.
Design, develop, and maintain an ETL framework for various new data sources.
Develop data ingestion using AWS Glue/EMR and data pipelines using PySpark, Python, and Databricks.
Build orchestration workflows using Airflow and Databricks job workflows.
Develop and execute ad hoc data ingestion to support business analytics.
Proactively interact with vendors for any questions and report the status accordingly.
Explore and evaluate tools/services to support business requirements.
Ability to help create a data-driven culture and impactful data strategies.
Aptitude for learning new technologies and solving complex problems.
Qualifications:
Minimum of a bachelor’s degree, preferably in Computer Science, Information Systems, or Information Technology.
Minimum 5 years of experience on cloud platforms such as AWS, Azure, GCP.
Minimum 5 years of experience with Amazon Web Services such as VPC, S3, EC2, Redshift, RDS, EMR, Athena, IAM, Glue, DMS, Data Pipeline & API, Lambda, etc.
Minimum of 5 years of experience in ETL and data engineering using Python, AWS Glue, AWS EMR/PySpark, and Airflow for orchestration.
Minimum 2 years of experience in Databricks, including Unity Catalog, data engineering, job workflow orchestration, and dashboard generation based on business requirements.
Minimum 5 years of experience in SQL, Python, and source control such as Bitbucket, and CI/CD for code deployment.
Experience in PostgreSQL, SQL Server, MySQL, and Oracle databases.
Experience with MPP systems such as AWS Redshift, AWS EMR, and Databricks SQL warehouses and compute clusters.
Experience in distributed programming with Python, Unix scripting, MPP, and RDBMS databases for data integration.
Experience building distributed high-performance systems using Spark/PySpark and AWS Glue, and developing applications for loading/streaming data into Databricks SQL warehouse and Redshift.
Experience in Agile methodology.
Proven ability to write technical specifications for data extraction and good-quality code.
Experience with big data processing techniques using Sqoop, Spark, and Hive is an additional plus.
Experience in data visualization tools including Power BI and Tableau.
Nice to have: experience building UIs using the Python Flask framework and Angular.
Mandatory Skills: Python for Insights.
Experience: 5-8 Years.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry.
It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
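For illustration only, here is a minimal sketch of the kind of PySpark ingestion step a role like this typically involves: read raw data, apply a light transformation, and write curated, partitioned output. The bucket paths, column names, and types are assumptions made for the example, not details taken from the posting.

```python
# Illustrative PySpark ingestion sketch (paths and columns are assumptions).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingestion").getOrCreate()

# Read raw JSON landed in an object store.
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Light cleanup: drop bad rows, derive a date column, normalize types.
cleaned = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
)

# Write curated output partitioned by date for downstream analytics.
(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-bucket/curated/orders/"))
```

In practice a job like this would be scheduled by an orchestrator such as Airflow or a Databricks job workflow, as the posting describes.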
Posted 1 week ago
25.0 years
0 Lacs
Kochi, Kerala, India
On-site
Company Overview Milestone Technologies is a global IT managed services firm that partners with organizations to scale their technology, infrastructure and services to drive specific business outcomes such as digital transformation, innovation, and operational agility. Milestone is focused on building an employee-first, performance-based culture and for over 25 years, we have a demonstrated history of supporting category-defining enterprise clients that are growing ahead of the market. The company specializes in providing solutions across Application Services and Consulting, Digital Product Engineering, Digital Workplace Services, Private Cloud Services, AI/Automation, and ServiceNow. Milestone culture is built to provide a collaborative, inclusive environment that supports employees and empowers them to reach their full potential. Our seasoned professionals deliver services based on Milestone’s best practices and service delivery framework. By leveraging our vast knowledge base to execute initiatives, we deliver both short-term and long-term value to our clients and apply continuous service improvement to deliver transformational benefits to IT. With Intelligent Automation, Milestone helps businesses further accelerate their IT transformation. The result is a sharper focus on business objectives and a dramatic improvement in employee productivity. Through our key technology partnerships and our people-first approach, Milestone continues to deliver industry-leading innovation to our clients. With more than 3,000 employees serving over 200 companies worldwide, we are following our mission of revolutionizing the way IT is deployed. Job Overview Job Summary: We are seeking a highly experienced and visionary Databricks Data Architect with over 14 years in data engineering and architecture, including deep hands-on experience in designing and scaling Lakehouse architectures using Databricks . The ideal candidate will possess deep expertise across data modeling, data governance, real-time and batch processing, and cloud-native analytics using the Databricks platform. You will lead the strategy, design, and implementation of modern data architecture to drive enterprise-wide data initiatives and maximize the value from the Databricks platform. Key Responsibilities Lead the architecture, design, and implementation of scalable and secure Lakehouse solutions using Databricks and Delta Lake. Define and implement data modeling best practices, including medallion architecture (bronze/silver/gold layers). Champion data quality and governance frameworks leveraging Databricks Unity Catalog for metadata, lineage, access control, and auditing. Architect real-time and batch data ingestion pipelines using Apache Spark Structured Streaming, Auto Loader, and Delta Live Tables (DLT). Develop reusable templates, workflows, and libraries for data ingestion, transformation, and consumption across various domains. Collaborate with enterprise data governance and security teams to ensure compliance with regulatory and organizational data standards. Promote self-service analytics and data democratization by enabling business users through Databricks SQL and Power BI/Tableau integrations. Partner with Data Scientists and ML Engineers to enable ML workflows using MLflow, Feature Store, and Databricks Model Serving. Provide architectural leadership for enterprise data platforms, including performance optimization, cost governance, and CI/CD automation in Databricks. 
Define and drive the adoption of DevOps/MLOps best practices on Databricks using Databricks Repos, Git, Jobs, and Terraform. Mentor and lead engineering teams on modern data platform practices, Spark performance tuning, and efficient Delta Lake optimizations (Z-ordering, OPTIMIZE, VACUUM, etc.). Technical Skills 10+ years in Data Warehousing, Data Architecture, and Enterprise ETL design. 5+ years hands-on experience with Databricks on Azure/AWS/GCP, including advanced Apache Spark and Delta Lake. Strong command of SQL, PySpark, and Spark SQL for large-scale data transformation. Proficiency with Databricks Unity Catalog, Delta Live Tables, Autoloader, DBFS, Jobs, and Workflows. Hands-on experience with Databricks SQL and integration with BI tools (Power BI, Tableau, etc.). Experience implementing CI/CD on Databricks, using tools like Git, Azure DevOps, Terraform, and Databricks Repos. Proficient with streaming architecture using Spark Structured Streaming, Kafka, or Event Hubs/Kinesis. Understanding of ML lifecycle management with MLflow, and experience in deploying MLOps solutions on Databricks. Familiarity with cloud object stores (e.g., AWS S3, Azure Data Lake Gen2) and data lake architectures. Exposure to data cataloging and metadata management using Unity Catalog or third-party tools. Knowledge of orchestration tools like Airflow, Databricks Workflows, or Azure Data Factory. Experience with Docker/Kubernetes for containerization (optional, for cross-platform knowledge). Preferred Certifications (a Plus) Databricks Certified Data Engineer Associate/Professional Databricks Certified Lakehouse Architect Microsoft Certified: Azure Data Engineer / Azure Solutions Architect AWS Certified Data Analytics – Specialty Google Professional Data Engineer Compensation Estimated Pay Range: Exact compensation and offers of employment are dependent on circumstances of each case and will be determined based on job-related knowledge, skills, experience, licenses or certifications, and location. Our Commitment to Diversity & Inclusion At Milestone we strive to create a workplace that reflects the communities we serve and work with, where we all feel empowered to bring our full, authentic selves to work. We know creating a diverse and inclusive culture that champions equity and belonging is not only the right thing to do for our employees but is also critical to our continued success. Milestone Technologies provides equal employment opportunity for all applicants and employees. All qualified applicants will receive consideration for employment and will not be discriminated against on the basis of race, color, religion, gender, gender identity, marital status, age, disability, veteran status, sexual orientation, national origin, or any other category protected by applicable federal and state law, or local ordinance. Milestone also makes reasonable accommodations for disabled applicants and employees. We welcome the unique background, culture, experiences, knowledge, innovation, self-expression and perspectives you can bring to our global community. Our recruitment team is looking forward to meeting you. Show more Show less
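For context on the Delta Lake maintenance tasks named above (OPTIMIZE, Z-ordering, VACUUM), the following is a minimal sketch of what such a routine job can look like when run on a Databricks cluster. The table name, Z-order column, and retention window are assumptions for the example, not details from the posting.

```python
# Routine Delta Lake maintenance sketch (table name and retention are assumptions).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate rows by a frequently filtered column.
spark.sql("OPTIMIZE sales.orders ZORDER BY (customer_id)")

# Remove files no longer referenced by the table, keeping 7 days (168 hours)
# of history so time travel and concurrent readers are not broken.
spark.sql("VACUUM sales.orders RETAIN 168 HOURS")
```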
Posted 1 week ago
5.0 years
50 Lacs
Madurai, Tamil Nadu, India
Remote
Experience : 5.00 + years
Salary : INR 5000000.00 / year (based on experience)
Expected Notice Period : 15 Days
Shift : (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type : Remote
Placement Type : Full Time Permanent position (Payroll and Compliance to be managed by: Precanto)
(*Note: This is a requirement for one of Uplers' clients - a fast-growing, VC-backed B2B SaaS platform revolutionizing financial planning and analysis for modern finance teams.)
What do you need for this opportunity?
Must-have skills required: async workflows, MLOps, Ray Tune, Data Engineering, MLFlow, Supervised Learning, Time-Series Forecasting, Docker, machine_learning, NLP, Python, SQL
A fast-growing, VC-backed B2B SaaS platform revolutionizing financial planning and analysis for modern finance teams is looking for:
We are a fast-moving startup building AI-driven solutions for the financial planning workflow. We’re looking for a versatile Machine Learning Engineer to join our team and take ownership of building, deploying, and scaling intelligent systems that power our core product.
Job Description - Full-time
Team: Data & ML Engineering
We’re looking for someone with 5+ years of experience as a Machine Learning or Data Engineer (startup experience is a plus).
What You Will Do
Build and optimize machine learning models, from regression to time-series forecasting.
Work with data pipelines and orchestrate training/inference jobs using Ray, Airflow, and Docker.
Train, tune, and evaluate models using tools like Ray Tune, MLflow, and scikit-learn.
Design and deploy LLM-powered features and workflows.
Collaborate closely with product managers to turn ideas into experiments and production-ready solutions.
Partner with Software and DevOps engineers to build robust ML pipelines and integrate them with the broader platform.
Basic Skills
Proven ability to work creatively and analytically in a problem-solving environment.
Excellent communication (written and oral) and interpersonal skills.
Strong understanding of supervised learning and time-series modeling.
Experience deploying ML models and building automated training/inference pipelines.
Ability to work cross-functionally in a collaborative and fast-paced environment.
Comfortable wearing many hats and owning projects end-to-end.
Write clean, tested, and scalable Python and SQL code.
Leverage async workflows and cloud-native infrastructure (S3, Docker, etc.) for high-throughput data processing.
Advanced Skills
Familiarity with MLOps best practices.
Prior experience with LLM-based features or production-level NLP.
Experience with LLMs, vector stores, or prompt engineering.
Contributions to open-source ML or data tools.
TECH STACK
Languages: Python, SQL
Frameworks & Tools: scikit-learn, Prophet, pyts, MLflow, Ray, Ray Tune, Jupyter
Infra: Docker, Airflow, S3, asyncio, Pydantic
How to apply for this opportunity?
Step 1: Click On Apply! And Register or Login on our portal.
Step 2: Complete the Screening Form & Upload updated Resume.
Step 3: Increase your chances to get shortlisted & meet the client for the Interview!
About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well).
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
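As a rough illustration of the tuning stack this posting mentions (Ray Tune with scikit-learn), the sketch below runs a small hyperparameter search. The dataset, model, and search space are assumptions chosen only to keep the example self-contained; they are not details from the role.

```python
# Minimal hyperparameter-tuning sketch with Ray Tune and scikit-learn.
from ray import tune
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score


def train_model(config):
    # Score one hyperparameter combination with 3-fold cross-validation.
    X, y = load_diabetes(return_X_y=True)
    model = RandomForestRegressor(
        n_estimators=config["n_estimators"],
        max_depth=config["max_depth"],
        random_state=0,
    )
    score = cross_val_score(model, X, y, cv=3, scoring="r2").mean()
    return {"r2": score}  # returning a dict reports the trial's final metrics


tuner = tune.Tuner(
    train_model,
    param_space={
        "n_estimators": tune.choice([100, 200, 400]),
        "max_depth": tune.choice([4, 8, 16]),
    },
    tune_config=tune.TuneConfig(metric="r2", mode="max", num_samples=9),
)
results = tuner.fit()
print(results.get_best_result().config)
```

In a production setup, metrics and models from such runs would typically also be logged to a tracker like MLflow.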
Posted 1 week ago
3.0 years
4 - 4 Lacs
Bengaluru
On-site
Introduction
We are looking for candidates with 3 years of experience for this role.
Responsibilities include:
Develop and maintain front-end components using Angular
Assist in designing and building backend services using Python (Flask/FastAPI)
Integrate APIs and services from systems like Workday, HatchPay, and external payment gateways
Write clean, maintainable, and well-documented code aligned with internal standards
Support data ingestion, validation, and transformation workflows as part of the Payment Data Hub
Collaborate with senior developers on implementation tasks, code reviews, and testing
Troubleshoot and debug issues during development and post-deployment (hypercare phase)
Participate in Agile development ceremonies and contribute to sprint deliverables
Assist in maintaining CI/CD pipelines and version control practices (Git)
Follow secure coding practices and contribute to documentation
Primary Skills:
Proficiency in Python (preferably with Flask or FastAPI)
Experience in Angular (or a similar JavaScript framework like React)
Basic understanding of RESTful APIs and microservices
Working knowledge of SQL and data handling
Familiarity with Git and standard development workflows
Exposure to cloud platforms (preferably GCP or any major cloud provider)
Secondary Skills:
Exposure to data integration, ETL pipelines, or data lake concepts
Experience with CI/CD tools like Jenkins or GitHub Actions
Familiarity with Workday or financial/payment systems is a plus
Knowledge of Airflow, BigQuery, or Cloud Functions is a plus
Assist in implementing role-based access control (RBAC) mechanisms
Experience with authentication and authorization techniques such as OAuth2, JWT
Basic understanding of multi-tenant SaaS architecture
Experience with logging and monitoring tools like Stackdriver, Datadog, or Grafana
Job Details
Role: Junior Full Stack Developer
Location : Trivandrum/Bangalore/Kochi
Close Date : 13-06-2025
Interested candidates may forward their detailed resumes to Careers@reflectionsinfos.com along with their notice period, current and expected CTC details.
This is to notify jobseekers that some fraudsters are promising jobs with Reflections Info Systems for a fee. Please note that no payment is ever sought for jobs in Reflections. We contact our candidates only through our official website or LinkedIn, and all employment-related mails are sent through the official HR email id. Please contact careers@reflectionsinfos.com for any clarification/ alerts on this subject.
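For candidates less familiar with the backend stack named above, here is a minimal FastAPI sketch of the kind of REST endpoint such a role might build. The payment fields and route are illustrative assumptions, not details of the actual Payment Data Hub.

```python
# Minimal FastAPI endpoint sketch (model fields and route are assumptions).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Payment(BaseModel):
    payment_id: str
    amount: float
    currency: str = "USD"


@app.post("/payments")
def create_payment(payment: Payment) -> dict:
    # In a real service this would validate the payload and persist it.
    return {"status": "received", "payment_id": payment.payment_id}
```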
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
On-site
Responsibilities
Design, develop, and maintain scalable data pipelines and ETL processes
Optimize data flow and collection for cross-functional teams
Build infrastructure required for optimal extraction, transformation, and loading of data
Ensure data quality, reliability, and integrity across all data systems
Collaborate with data scientists and analysts to help implement models and algorithms
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, etc.
Create and maintain comprehensive technical documentation
Evaluate and integrate new data management technologies and tools
Requirements
3-5 years of professional experience in data engineering roles
Bachelor's degree in Computer Science, Engineering, or related field; Master's degree preferred
Job Description
Expert knowledge of SQL and experience with relational databases (e.g., PostgreSQL, Redshift, TiDB, MySQL, Oracle, Teradata)
Extensive experience with big data technologies (e.g., Hadoop, Spark, Hive, Flink)
Proficiency in at least one programming language such as Python, Java, or Scala
Experience with data modeling, data warehousing, and building ETL pipelines
Strong knowledge of data pipeline and workflow management tools (e.g., Airflow, Luigi, NiFi)
Experience with cloud platforms (AWS, Azure, or GCP) and their data services; AWS preferred
Hands-on experience building streaming pipelines with Flink, Kafka, or Kinesis; Flink preferred
Understanding of data governance and data security principles
Experience with version control systems (e.g., Git) and CI/CD practices
Preferred Skills
Experience with containerization and orchestration tools (Docker, Kubernetes)
Basic knowledge of machine learning workflows and MLOps
Experience with NoSQL databases (MongoDB, Cassandra, etc.)
Familiarity with data visualization tools (Tableau, Power BI, etc.)
Experience with real-time data processing
Knowledge of data governance frameworks and compliance requirements (GDPR, CCPA, etc.)
Experience with infrastructure-as-code tools (Terraform, CloudFormation)
Personal Qualities
Strong problem-solving skills and attention to detail
Excellent communication skills, both written and verbal
Ability to work independently and as part of a team
Proactive approach to identifying and solving problems
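To illustrate the streaming-pipeline experience requested above, here is a minimal Spark Structured Streaming sketch that reads events from Kafka and writes Parquet. The broker address, topic, schema, and output paths are assumptions for the example, and the Kafka connector package must be available on the cluster.

```python
# Illustrative Kafka-to-Parquet streaming sketch (broker, topic, schema assumed).
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("clickstream_ingest").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("event", StringType()),
    StructField("value", DoubleType()),
])

# Parse the Kafka message value (bytes) into typed columns.
events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "clickstream")
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Continuously append parsed events to object storage with checkpointing.
query = (
    events.writeStream
          .format("parquet")
          .option("path", "s3://example-bucket/streams/clickstream/")
          .option("checkpointLocation", "s3://example-bucket/checkpoints/clickstream/")
          .start()
)
query.awaitTermination()
```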
Posted 1 week ago
5.0 years
0 Lacs
Karnataka
On-site
WHO YOU’LL WORK WITH
This role is part of Nike’s Content Technology team within the Consumer Product and Innovation (CP&I) organization, working very closely with the globally distributed Engineering and Product teams. This role will roll up to the Director of Software Engineering based out of Nike India Tech Centre.
WHO WE ARE LOOKING FOR
We are looking for an experienced, technology-focused, and hands-on Lead Engineer to join our team in Bengaluru, India. As a Senior Data Engineer, you will play a key role in ensuring that our data products are robust and capable of supporting our Data Engineering and Business Intelligence initiatives.
A data engineer with 5+ years of experience working with cloud-native platforms.
Advanced skills in SQL, PySpark, Apache Airflow (or similar workflow management tools), Databricks, and Snowflake.
Deep understanding of Spark optimization, Delta Lake, and Medallion architecture.
Strong experience in data modeling and data quality practices.
Experience with Tableau for data validation and monitoring.
Exposure to DevOps practices, CI/CD, Git, and security aspects.
Effective mentorship and team collaboration skills.
Strong communication skills, able to explain technical concepts clearly.
Experience with Kafka or other real-time systems.
Preferred: Familiarity with ML/GenAI integration into pipelines. Databricks Data Engineer certification.
WHAT YOU’LL WORK ON
Own and optimize large-scale ETL/ELT pipelines and reusable frameworks.
Collaborate with cross-functional teams to translate business requirements into technical solutions.
Guide junior engineers through code reviews and design discussions.
Monitor data quality, availability, and system performance.
Lead CI/CD implementation and improve workflow automation.
Posted 1 week ago
0.0 - 3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
You Lead the Way. We’ve Got Your Back.
With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally.
At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.
American Express has embarked on an exciting transformation driven by an energetic new team of an inclusive pool of candidates to give all an equal opportunity for growth.
Service Operations is responsible for providing reliable platforms for hundreds of critical applications and utilities within American Express. Its primary focus is to provide technical expertise and tooling to ensure the highest level of reliability and availability for critical applications. The team provides consultation and strategic recommendations by quickly assessing and remediating complex availability issues, and is responsible for driving automation and efficiencies that increase quality, availability, and auto-healing of complex processes.
Responsibilities include, but are not limited to:
The ideal candidate will be responsible for designing, developing, and maintaining data pipelines.
Serving as a core member of an agile team that drives user story analysis and elaboration, and designs and develops responsive web applications using the best engineering practices.
You will work closely with data scientists, analysts, and other partners to ensure the flawless flow of data.
You will build and optimize reports for analytical and business purposes.
Monitor and resolve data pipeline issues to ensure smooth operation.
Implement data quality checks and validation processes to ensure the accuracy, completeness, and consistency of data.
Implement data governance policies, access controls, and security measures to protect critical data and ensure compliance.
Develop a deep understanding of integrations with other systems and platforms within the supported domains.
Bring a culture of innovation, ideas, and continuous improvement. Challenge the status quo, demonstrate risk-taking, and implement creative ideas.
Manage your own time, and work well both independently and as part of a team.
Adopt emerging standards while promoting best practices and consistent framework usage.
Work with Product Owners to define requirements for new features and plan increments of work.
Minimum Qualifications
BS or MS degree in computer science, computer engineering, or another technical subject area, or equivalent.
0 to 3 years of work experience.
At least 1 to 3 years of hands-on experience with SQL, including schema design, query optimization, and performance tuning.
Experience with distributed computing frameworks like Hadoop, Hive, and Spark for processing large-scale data sets.
Proficiency in programming languages such as Python and PySpark for building data pipelines and automation scripts.
Understanding of cloud computing and exposure to BigQuery and Airflow for executing DAGs.
Knowledge of CI/CD, Git commands, and deployment processes.
Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues and optimize data processing workflows.
Excellent communication and collaboration skills.
We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life. Benefits include:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations. Show more Show less
Posted 1 week ago
2.0 years
0 Lacs
Karnataka
On-site
WHO YOU’LL WORK WITH
This role is part of Nike’s Content Technology team within the Consumer Product and Innovation (CP&I) organization, working very closely with the globally distributed Engineering and Product teams. This role will roll up to the Director of Software Engineering based out of Nike India Tech Centre.
WHO WE ARE LOOKING FOR
We are looking for an experienced, technology-focused, and hands-on Lead Engineer to join our team in Bengaluru, India. As a Senior Data Engineer, you will play a key role in ensuring that our data products are robust and capable of supporting our Data Engineering and Business Intelligence initiatives.
A data engineer with 2+ years of experience in data engineering.
Proficient in SQL, Python, PySpark, and Apache Airflow (or similar workflow management tools).
Hands-on experience with Databricks, Snowflake, and cloud platforms (AWS/GCP/Azure).
Good understanding of Spark, Delta Lake, Medallion architecture, and ETL/ELT processes.
Solid data modeling and data profiling skills.
Familiarity with Agile methodologies (Scrum/Kanban).
Awareness of DevOps practices in data engineering (automated testing, security administration, workflow orchestration).
Exposure to Kafka or real-time data processing.
Strong communication and collaboration skills.
Preferred: familiarity with Tableau or similar BI tools; exposure to GenAI/ML pipelines.
Nice to have: Databricks certifications for data engineer, developer, or Apache Spark.
WHAT YOU’LL WORK ON
Build and maintain ETL/ELT pipelines and reusable data components.
Collaborate with peers and stakeholders to gather data requirements.
Participate in code reviews and contribute to quality improvements.
Monitor and troubleshoot data pipelines for performance and reliability.
Support CI/CD practices in data engineering workflows.
Posted 1 week ago
12.0 years
0 Lacs
Karnataka
On-site
WHO YOU’LL WORK WITH
This role is part of Nike’s Content Technology team within the Consumer Product and Innovation (CP&I) organization, working very closely with the globally distributed Engineering and Product teams. This role will roll up to the Director of Software Engineering based out of Nike India Tech Centre.
WHO WE ARE LOOKING FOR
We are looking for an experienced, technology-focused, and hands-on Engineering Manager to join our team in Bengaluru, India. The Engineering Manager supports a squad of world-class engineers, manages delivery priorities, and ensures high-quality, well-architected solutions are delivered on time and within budget.
A seasoned data engineering leader with 12+ years of experience in data engineering and at least 3 years in people or technical leadership roles.
Proven expertise in Databricks, Snowflake, Spark, Delta Lake, Airflow, and cloud platforms (preferably AWS).
Strong background in integrating ML/GenAI into data platforms.
Deep understanding of data modeling, performance tuning, platform scalability, Data Lakes, Medallion architecture, and Delta Lakes.
Experience with CI/CD pipelines, version control (Git), and DevOps practices in a data engineering context.
Excellent leadership, mentoring, and team-building skills.
Strong problem-solving abilities and the capability to design solutions for complex data challenges.
Effective communicator, able to translate complex technical concepts for non-technical stakeholders and work cross-functionally.
Bachelor’s degree or higher in Computer Science, Engineering, or a related field, or equivalent experience.
Preferred: Experience with Kafka/Kinesis or real-time data streaming, Databricks certifications, and familiarity with Tableau or other analytical tools.
WHAT YOU’LL WORK ON
Lead and grow a high-performing team of data engineers and analysts.
Define and drive the team’s technical roadmap in alignment with business goals.
Oversee the development of scalable data platforms, architecture, and best practices.
Drive the implementation of governance, CI/CD, and operational excellence.
Coach engineers, manage performance, and foster a collaborative culture.
Partner with cross-functional leaders to align priorities and execution.
Ensure technical delivery and solution quality across initiatives.
Posted 1 week ago
6.0 years
0 Lacs
Karnataka
On-site
WHO YOU’LL WORK WITH
This role is part of Nike’s Content Technology team within the Consumer Product and Innovation (CP&I) organization, working very closely with the globally distributed Engineering and Product teams. This role will roll up to the Director of Software Engineering based out of Nike India Tech Centre.
WHO WE ARE LOOKING FOR
We are looking for an experienced, technology-focused, and hands-on Lead Engineer to join our team in Bengaluru, India. As a Lead Data Engineer, you will be responsible for designing, building, and maintaining scalable data pipelines and analytics solutions, and will play a key role in ensuring that our data products are robust and capable of supporting our Advanced Analytics and Business Intelligence initiatives.
Bachelor’s degree or higher in Computer Science, Engineering, or a related field, or equivalent experience.
An experienced data engineer with 6+ years in data engineering and at least 2 years in technical leadership roles.
Deep hands-on expertise with Databricks, Snowflake, Spark, Delta Lake, and Apache Airflow (or similar workflow management tools).
Expert in SQL and Spark optimization techniques.
Strong command of Medallion architecture and data modeling.
Experience integrating ML/GenAI pipelines is a plus.
Exposure to Tableau or other BI tools.
Strong leadership skills with a proven ability to lead and mentor data engineering teams.
Excellent problem-solving skills and the ability to design solutions for complex data challenges.
Skilled in collaborating cross-functionally and mentoring engineers.
Effective communicator, able to work cross-functionally and translate technical concepts for non-technical stakeholders.
Preferred: Experience with Kafka/Kinesis or real-time data processing, and certifications in Databricks or Spark (strongly preferred).
WHAT YOU’LL WORK ON
Architect and lead the development of scalable data pipelines and platform features.
Translate business needs into technical solutions in collaboration with product teams, business stakeholders, and data science teams.
Enforce best practices across teams, including governance, quality, and coding standards.
Lead code reviews, pair programming, and mentoring.
Troubleshoot complex systems and optimize distributed pipelines.
Automate deployments using CI/CD and DevOps practices.
Posted 1 week ago
The airflow job market in India is rapidly growing as more companies are adopting data pipelines and workflow automation. Airflow, an open-source platform, is widely used for orchestrating complex computational workflows and data processing pipelines. Job seekers with expertise in airflow can find lucrative opportunities in various industries such as technology, e-commerce, finance, and more.
The average salary range for airflow professionals in India varies based on experience levels:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum
In the field of airflow, a typical career path may progress as follows:
- Junior Airflow Developer
- Airflow Developer
- Senior Airflow Developer
- Airflow Tech Lead
In addition to airflow expertise, professionals in this field are often expected to have or develop skills in:
- Python programming
- ETL concepts
- Database management (SQL)
- Cloud platforms (AWS, GCP)
- Data warehousing
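To make that skill list concrete, below is a minimal Airflow 2-style DAG that combines Python, a simple extract-and-transform step, and daily scheduling. The file paths, column names, and schedule are illustrative assumptions rather than a prescribed setup.

```python
# Minimal Airflow DAG sketch: extract a CSV, aggregate it, and store the result.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Read a raw file; in practice this might pull from S3, an API, or a database.
    df = pd.read_csv("/tmp/raw_orders.csv")
    df.to_parquet("/tmp/staged_orders.parquet")


def transform():
    # Aggregate staged data into a small reporting table.
    df = pd.read_parquet("/tmp/staged_orders.parquet")
    daily = df.groupby("order_date", as_index=False)["amount"].sum()
    daily.to_parquet("/tmp/daily_revenue.parquet")


with DAG(
    dag_id="daily_revenue_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # run extract before transform
```

Real-world DAGs add retries, alerting, and connections to warehouses or data lakes, which is where the SQL, cloud, and data warehousing skills listed above come in.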
As you explore job opportunities in the airflow domain in India, remember to showcase your expertise, skills, and experience confidently during interviews. Prepare well, stay updated with the latest trends in airflow, and demonstrate your problem-solving abilities to stand out in the competitive job market. Good luck!