7.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Title: Automation Tester - Selenium, Python, Databricks
Candidate Specification: 7+ years, immediate to 30 days' notice.

Job Description:
- Experience with automated testing.
- Ability to write and read a programming language (Python).
- Experience with pytest and Selenium (Python).
- Experience working with large datasets and complex data environments.
- Experience with Airflow, Databricks, data lakes, and PySpark.
- Knowledge of and working experience with Agile methodologies.
- Experience with CI/CD/CT methodology.
- Experience with test methodologies.

Skills Required
Role: Automation Tester
Industry Type: IT/Computers - Software
Required Education: B.Tech
Employment Type: Full Time, Permanent
Key Skills: SELENIUM, PYTHON, DATABRICKS

Other Information
Job Code: GO/JC/100/2025
Recruiter Name: Sheena Rakesh
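For context on the stack this role names, here is a minimal, hypothetical pytest-plus-Selenium check; the URL, fixture, and assertion are illustrative, not taken from the posting.

    # Minimal pytest + Selenium (Python) smoke test; all names and the URL are invented.
    import pytest
    from selenium import webdriver

    @pytest.fixture
    def driver():
        options = webdriver.ChromeOptions()
        options.add_argument("--headless=new")   # run without a visible browser window
        drv = webdriver.Chrome(options=options)
        yield drv
        drv.quit()

    def test_homepage_title(driver):
        driver.get("https://example.com")        # placeholder URL
        assert "Example" in driver.title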
Posted 4 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We're looking for a dynamic Senior Data Scientist to join our team, working alongside our founders, financial analysts, product managers, and engineers. This is your chance to get hands-on with massive datasets, crafting the metrics that matter to institutional investors worldwide. For those who are curious and passionate about financial technologies, this is a great opportunity to work at an extremely well-capitalized startup with a proven team of senior financial and tech industry talent to build the future of investing tools.

Responsibilities:
- Work with large and complex data sets and derive investment-relevant metrics in close partnership with financial analysts and engineers.
- Apply knowledge of statistics, programming, data modeling, simulation, and advanced mathematics to recognize patterns, identify opportunities, pose business questions, and make valuable discoveries leading to the development of fundamental metrics needed to evaluate various assets.
- Implement the risk factor model using advanced statistical and programming techniques, ensuring its performance aligns with the proposed design.
- Conduct rigorous testing and validation of the model to ensure its effectiveness in identifying, quantifying, and managing various risk factors.
- Build customer-facing metrics and dashboards.
- Work closely with analysts, engineers, and product managers and provide feedback as we develop our data analytics and research platform.

Requirements:
- Bachelor's degree in Mathematics, Statistics, or another analytical field (e.g. Computer Science, Engineering, Operations Research, Management Science), or equivalent practical experience. IIT preferred.
- 3+ years of experience with data analysis and metrics development.
- 3+ years of experience analyzing and interpreting data, drawing conclusions, defining recommended actions, and reporting results across stakeholders.
- 2+ years of experience writing SQL queries.
- 2+ years of experience scripting in Python.
- Interest in learning new technologies to solve customer needs, with lots of creative freedom.
- Strong communication skills and business acumen.
- Self-starter, motivated by an interest in developing the best possible solutions to problems.
- Strong SQL and Python programming skills.
- Experience with Airflow, Google BigQuery, and ClickHouse is a plus.
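As a rough illustration of the SQL/Python metrics work this role describes, a toy pandas sketch deriving per-ticker return volatility; the dataset, columns, and metric are invented.

    # Toy metric derivation in pandas; data and metric are invented.
    import pandas as pd

    prices = pd.DataFrame({
        "ticker": ["AAA", "AAA", "AAA", "BBB", "BBB", "BBB"],
        "date": pd.to_datetime(["2024-01-02", "2024-01-03", "2024-01-04"] * 2),
        "close": [100.0, 102.0, 101.0, 50.0, 49.0, 49.5],
    })
    returns = (
        prices.sort_values(["ticker", "date"])
              .assign(ret=lambda df: df.groupby("ticker")["close"].pct_change())
    )
    volatility = returns.groupby("ticker")["ret"].std()  # one simple investment-relevant metric
    print(volatility)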
Posted 4 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Senior Full Stack Developer – Python + React JS - Chennai

Job Overview: Candidates with 5+ years of experience in Python development, and 3+ years of experience in React/Next.js, preferred.

Key Responsibilities:
- Proven delivery of production APIs using FastAPI (or Flask/Django REST Framework).
- Deep SQL skills and hands-on experience with SQLAlchemy (or a comparable ORM) and Microsoft SQL Server (or similar).
- Comfortable with async programming and task queues (Celery/RQ) or workflow engines (Prefect/Airflow).
- GraphQL experience: query design, schema stitching, client integration, etc.
- Strong testing culture: pytest, tox, coverage, mocking, etc.
- CI/CD experience: GitHub Actions / Azure DevOps, Docker build & registry, automated quality gates.

Skills: Should be experienced with Python (FastAPI or Flask), React JS, Next JS, GraphQL, and Azure DevOps.

Role: Senior Full Stack Developer – Python + React JS - Chennai
Industry Type: IT/Computers - Software
Required Education: B.Tech
Employment Type: Full Time, Permanent
Key Skills: AZURE DEVOPS, PYTHON, REACT JS, SQL
Job Code: GO/JC/156/2025
Recruiter Name: Brindha Kamaraj
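To make the FastAPI requirement concrete, a minimal service sketch; the model and routes are hypothetical, not part of the job description.

    # Minimal FastAPI service; the Item model and routes are hypothetical.
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    class Item(BaseModel):
        name: str
        price: float

    @app.get("/health")
    async def health():
        return {"status": "ok"}

    @app.post("/items")
    async def create_item(item: Item):
        return {"received": item.name, "price": item.price}

    # Run locally with: uvicorn main:app --reload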
Posted 4 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
What You'll Do
We are looking for an experienced Senior Data Engineer to join our Data Operations team. The ideal candidate will have expertise in Python, Snowflake, SQL, modern ETL tools, and business intelligence platforms such as Power BI. You will need experience integrating SaaS applications such as Salesforce, Zuora, and NetSuite using REST APIs. You will build and maintain data pipelines, develop data models, and ensure seamless data integrations that support business analytics and reporting. The role requires flexibility to collaborate in US time zones as needed. You will report to the Manager, Finance Applications.

What Your Responsibilities Will Be
- Design, develop, and maintain scalable data pipelines and workflows using modern ETL tools and Python.
- Build and optimize SQL queries and data models on Snowflake to support analytics and reporting needs.
- Integrate with SaaS platforms such as Salesforce, Zuora, and NetSuite using APIs or native connectors.
- Develop and support dashboards and reports using Power BI and other reporting tools.
- Work with data analysts, business users, and other engineering teams to gather requirements and deliver high-quality solutions.
- Ensure data quality, accuracy, and consistency across systems and datasets.
- Write clean, well-documented, and testable code with a focus on performance and reliability.
- Participate in peer code reviews and contribute to best practices in data engineering.
- Be available for meetings and collaboration in US time zones as required.

What You'll Need To Be Successful
- 5+ years of experience in the data engineering field, with deep SQL knowledge.
- Experience with Snowflake (SQL), Python, AWS services, Power BI, and ETL tools (dbt, Airflow) is a must.
- Proficiency in Python for data transformation and scripting.
- Proficiency in writing complex SQL queries and stored procedures.
- Experience with data warehouse, data modeling, and ETL design concepts.
- Experience integrating SaaS systems like Salesforce, Zuora, and NetSuite along with relational databases, REST APIs, and FTP/SFTP.
- Knowledge of AWS technologies (EC2, S3, RDS, Redshift, etc.).
- Excellent communication skills, with the ability to translate technical issues for non-technical stakeholders.
- Flexibility to work during US business hours as required for team meetings and collaboration.

How We'll Take Care Of You
Total Rewards: In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses.
Health & Wellness: Benefits vary by location but generally include private medical, life, and disability insurance.
Inclusive culture and diversity: Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture. We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship.

What You Need To Know About Avalara
We're Avalara. We're defining the relationship between tax and tech. We've already built an industry-leading cloud compliance platform, processing nearly 40 billion customer API calls and over 5 million tax returns a year, and this year we became a billion-dollar business. Our growth is real, and we're not slowing down until we've achieved our mission - to be part of every transaction in the world. We're bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we've designed, that empowers our people to win.
Ownership and achievement go hand in hand here. We instill passion in our people through the trust we place in them. We've been different from day one. Join us, and your career will be too.

We're An Equal Opportunity Employer
Supporting diversity and inclusion is a cornerstone of our company — we don't want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national origin, disability, sexual orientation, US Veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.
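As a sketch of the SaaS-to-Snowflake integration pattern the posting describes, a minimal Python example; the endpoint, table, and credentials are placeholders, and the snowflake-connector-python package is assumed.

    # Hypothetical REST-to-Snowflake load; endpoint, schema, and credentials are placeholders.
    import requests
    import snowflake.connector  # assumes the snowflake-connector-python package

    resp = requests.get("https://api.example.com/v1/invoices", timeout=30)
    resp.raise_for_status()
    rows = [(r["id"], r["amount"]) for r in resp.json()["items"]]

    conn = snowflake.connector.connect(user="...", password="...", account="...")
    try:
        cur = conn.cursor()
        cur.executemany("INSERT INTO staging.invoices (id, amount) VALUES (%s, %s)", rows)
        conn.commit()
    finally:
        conn.close()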
Posted 4 days ago
14.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At Cotality, we are driven by a single mission—to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society.

Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry.

Job Description
In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's Product Development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights.

The Manager of Data Engineering is a leader with a passion for leveraging technical solutions to address business challenges. This role requires deep expertise in agile development practices and a commitment to extreme ownership of software products and platforms. The ideal candidate will collaborate closely with Product Managers and Engineering teams to drive both transformational and operational initiatives.

Key Responsibilities Would Include
- Motivate a high-performing team to achieve outstanding results while fostering individual growth and development.
- Proven track record of leading teams that have successfully built and launched products in highly scalable growth markets.
- Experience in building teams and leading organizational change within a fast-paced and highly regarded technology company.
- Partner with business product leadership to develop strategies and technology roadmaps.
- Positive client engagement at senior levels, managing and navigating relationships with peers, business line leaders, and engagement across the Protect matrix of services.
- IT solution development that aligns with the company's strategic architecture.
- Manage vendor solutions and deliverables.
- Responsible for the delivery of complex Data Engineering projects, including data marts, data lakes, data interchanges, and data warehouses.
- Responsible for managing the costs, optimizations, and overall cost strategies for cloud computing in your domain.

Job Qualifications: Experience
- Total experience must be in the range of 14-16 years.
- 10+ years of progressive experience in Data Engineering.
- Must have 5+ years of experience in MSSQL, Dataflow, GCP, and Snowflake.
- 5-8 years of experience leading technology teams.
- Proven success in delivering complex, high-impact projects with measurable business outcomes.
- Minimum 3 years of experience supporting cloud-based data technologies.

Education
A BS in Computer Science or an equivalent degree is highly preferred.

Technical Expertise
Experience developing or leading development teams using the following technologies (or similar): Google Dataflow, Google Airflow, Microsoft SQL Server Integration Services (SSIS), PostgreSQL, Google BigQuery, Google Cloud Platform, REST services, and data visualization tools (e.g. Power BI, SSRS).

Industry Experience: Background in the Property and Casualty Insurance industry is preferred.

Cotality's Diversity Commitment
Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone's unique contributions, experiences and values. We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences.

Equal Opportunity Employer Statement
Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace. Please apply on our website for consideration.

Privacy Policy: Global Applicant Privacy Policy

By providing your telephone number, you agree to receive automated (SMS) text messages at that number from Cotality regarding all matters related to your application and, if you are hired, your employment and company business. Message & data rates may apply. You can opt out at any time by responding STOP or UNSUBSCRIBING and will automatically be opted out company-wide.
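For a flavor of the Google Dataflow work listed above, a minimal Apache Beam (Python) pipeline sketch; the bucket paths and CSV layout are invented.

    # Minimal Apache Beam pipeline; paths and the CSV layout are invented.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    opts = PipelineOptions()  # add runner="DataflowRunner" plus project/region to run on Dataflow
    with beam.Pipeline(options=opts) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/claims.csv", skip_header_lines=1)
            | "Parse" >> beam.Map(lambda line: line.split(","))
            | "KeyByState" >> beam.Map(lambda fields: (fields[2], 1))  # assumes state in column 3
            | "Count" >> beam.CombinePerKey(sum)
            | "Write" >> beam.io.WriteToText("gs://example-bucket/claims_by_state")
        )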
Posted 4 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Location - Bangalore (Whitefield)
Work Mode - WFO during probation (3 months), later Hybrid

Role Overview
Experience:
1. Ten-plus years' experience in software development, with at least the last few years in a leadership role with progressively increasing responsibilities.
2. Extensive experience in the following areas:
   a. C#, .NET
   b. Designing and building cloud-native solutions (Azure, AWS, Google Cloud Platform)
   c. Infrastructure as Code tools (Terraform, Pulumi, cloud-specific IaC tools)
   d. Configuration management tools (Ansible, Chef, SaltStack)
   e. Containerization and orchestration technologies (Docker, Kubernetes)
   f. Native and third-party Databricks integrations (Delta Live Tables, Auto Loader, Databricks Workflows / Apache Airflow, Unity Catalog)
3. Extensive experience in Azure.
4. Experience designing and implementing data security and governance platforms adhering to compliance standards (HIPAA, SOC 2) preferred.
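To illustrate the Databricks Auto Loader integration named in item 2f, a minimal PySpark streaming sketch; the paths and table name are placeholders, and `spark` is assumed to be the session the Databricks runtime provides.

    # Hypothetical Databricks Auto Loader ingestion; paths and table are placeholders,
    # and `spark` is supplied by the Databricks runtime.
    df = (
        spark.readStream
             .format("cloudFiles")                                    # Auto Loader source
             .option("cloudFiles.format", "json")
             .option("cloudFiles.schemaLocation", "/mnt/schemas/events")
             .load("/mnt/raw/events")
    )
    (
        df.writeStream
          .option("checkpointLocation", "/mnt/checkpoints/events")
          .trigger(availableNow=True)                                 # drain available files, then stop
          .toTable("bronze.events")
    )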
Posted 4 days ago
3.0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
Location: Remote/Hybrid (India-based preferred)
Type: Full-Time

Must Haves (Don't Apply If You Miss Any)
- 3+ years of experience in Data Engineering
- Proven hands-on experience with ETL pipelines (end-to-end ownership)
- AWS resources: deep experience with EC2, Athena, Lambda, Step Functions (non-negotiable; critical to the role)
- Strong with MySQL (not negotiable)
- Docker (setup, deployment, troubleshooting)

Good To Have (Adds Major Value)
- Airflow or any modern orchestration tool
- PySpark experience
- Python ecosystem: SQLAlchemy, DuckDB, PyArrow, pandas, NumPy, DLT (Data Load Tool)

About You
You're a builder, not just a maintainer. You can work independently but communicate crisply. You thrive in fast-moving, startup environments. You care about ownership and impact, not just code. Include the code word Red Panda in your application message, so that we know you have read this section.

What You'll Do
- Architect, build, and optimize robust data pipelines and workflows
- Own AWS resource configuration, optimization, and troubleshooting
- Collaborate with product and engineering teams to deliver business impact fast
- Automate and scale data processes—no manual work culture
- Shape the data foundation for real business decisions

Cut to the chase. Only serious, relevant applicants will be considered.
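As a small illustration of the Athena-centric AWS work this posting stresses, a boto3 sketch that launches an Athena query; the region, database, table, and results bucket are all placeholders.

    # Hypothetical Athena query launch via boto3; region, database, and S3 paths are placeholders.
    import boto3

    athena = boto3.client("athena", region_name="ap-south-1")
    resp = athena.start_query_execution(
        QueryString="SELECT order_id, total FROM orders LIMIT 10",
        QueryExecutionContext={"Database": "sales"},
        ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
    )
    print(resp["QueryExecutionId"])  # poll get_query_execution() with this ID for status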
Posted 4 days ago
7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do
- Demonstrate a deep understanding of cloud-native, distributed microservice-based architectures.
- Deliver solutions for complex business problems through a standard software SDLC.
- Build strong relationships with both internal and external stakeholders, including product, business and sales partners.
- Demonstrate excellent communication skills with the ability to both simplify complex problems and also dive deeper if needed.
- Build and manage strong technical teams that deliver complex software solutions that scale.
- Manage teams with cross-functional skills that include software, quality, and reliability engineers, project managers and scrum masters.
- Provide deep troubleshooting skills with the ability to lead and solve production and customer issues under pressure.
- Leverage strong experience in full-stack software development and public cloud like GCP and AWS.
- Mentor, coach and develop junior and senior software, quality and reliability engineers.
- Lead with a data/metrics-driven mindset with a maniacal focus towards optimizing and creating efficient solutions.
- Ensure compliance with EFX secure software development guidelines and best practices; responsible for meeting and maintaining QE, DevSec, and FinOps KPIs.
- Define, maintain and report SLAs, SLOs, and SLIs meeting EFX engineering standards in partnership with the product, engineering and architecture teams.
- Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices.
- Drive up-to-date technical documentation, including support and end-user documentation and runbooks.
- Lead sprint planning, sprint retrospectives, and other team activities.
- Responsible for implementation architecture decision-making associated with product features/stories, refactoring work, and EOSL decisions.
- Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision, and present complex information in a concise format that is audience appropriate.

What Experience You Need
- Bachelor's degree or equivalent experience
- 7+ years of software engineering experience
- 7+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
- 7+ years of experience with cloud technology: GCP, AWS, or Azure
- 7+ years of experience designing and developing cloud-native solutions
- 7+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
- 7+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm charts, and Terraform constructs

What Could Set You Apart
- Self-starter that identifies/responds to priority shifts with minimal supervision.
- Strong communication and presentation skills
- Strong leadership qualities
- Demonstrated problem-solving skills and the ability to resolve conflicts
- Experience creating and maintaining product and software roadmaps
- Experience overseeing yearly as well as product/project budgets
- Working in a highly regulated environment
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as Java/J2EE, SpringBoot, SOA and microservices
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Developing with modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI
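As one concrete example of the GCP messaging stack listed above, a minimal google-cloud-pubsub publisher sketch; the project and topic names are invented.

    # Hypothetical Pub/Sub publish; project and topic names are invented.
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("example-project", "credit-events")
    future = publisher.publish(topic_path, data=b'{"event": "score_updated"}')
    print(future.result())  # blocks until publish is confirmed; returns the message ID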
Posted 4 days ago
15.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Your Team Responsibilities
MSCI has an immediate opening in one of our fastest-growing product lines. As a Lead Architect within Sustainability and Climate, you are an integral part of a team that works to develop high-quality architecture solutions for various software applications on modern cloud-based technologies. As a core technical contributor, you are responsible for conducting critical architecture solutions across multiple technical areas to support project goals. The systems under your responsibility will be amongst the most mission-critical systems of MSCI. They require strong technology expertise and a strong sense of enterprise system design, state-of-the-art scalability and reliability, but also innovation. Your ability to take technology decisions in a consistent framework to support the growth of our company and products, lead the various software implementations in close partnership with global leaders and multiple product organizations, and drive technology innovations will be the key measures of your success in our dynamic and rapidly growing environment. At MSCI, you will be operating in a culture where we value merit and track record. You will own the full life-cycle of the technology services and provide management, technical and people leadership in the design, development, quality assurance and maintenance of our production systems, making sure we continue to scale our great franchise.

Your Skills And Experience That Will Help You Excel
- Prior senior software architecture roles.
- Proficiency in programming languages such as Python/Java/Scala and knowledge of SQL and NoSQL databases.
- Drive the development of conceptual, logical, and physical data models aligned with business requirements.
- Lead the implementation and optimization of data technologies, including Apache Spark.
- Experience with one of the table formats, such as Delta or Iceberg.
- Strong hands-on experience in data architecture, database design, and data modeling.
- Proven experience as a Data Platform Architect or in a similar role, with expertise in Airflow, Databricks, Snowflake, Collibra, and Dremio.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Ability to dive into details; a hands-on technologist with strong core computer science fundamentals.
- Strong preference for financial services experience.
- Proven leadership of large-scale distributed software teams that have delivered great products on deadline.
- Experience in a modern iterative software development methodology.
- Experience with globally distributed teams and business partners.
- Experience in building and maintaining applications that are mission-critical for customers.
- M.S. in Computer Science, Management Information Systems or a related engineering field.
- 15+ years of software engineering experience.
- Demonstrated consensus builder and collegial peer.

About MSCI

What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
- Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
- We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com.
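To ground the Spark and Delta/Iceberg requirements above, a minimal PySpark read of a Delta table; the path, columns, and aggregation are invented, and the runtime is assumed to have Delta Lake configured.

    # Hypothetical Delta table read in PySpark; path and columns are invented.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("esg-scores").getOrCreate()  # assumes a Delta-enabled runtime

    scores = spark.read.format("delta").load("/data/esg/scores")
    latest = scores.groupBy("issuer_id").agg(F.max("as_of_date").alias("latest_as_of"))
    latest.show()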
Posted 4 days ago
5.0 - 9.0 years
19 - 23 Lacs
Mumbai
Work from Office
Overview
MSCI has an immediate opening in one of our fastest-growing product lines. As a Lead Architect within Sustainability and Climate, you are an integral part of a team that works to develop high-quality architecture solutions for various software applications on modern cloud-based technologies. As a core technical contributor, you are responsible for conducting critical architecture solutions across multiple technical areas to support project goals. The systems under your responsibility will be amongst the most mission-critical systems of MSCI. They require strong technology expertise and a strong sense of enterprise system design, state-of-the-art scalability and reliability, but also innovation. Your ability to take technology decisions in a consistent framework to support the growth of our company and products, lead the various software implementations in close partnership with global leaders and multiple product organizations, and drive technology innovations will be the key measures of your success in our dynamic and rapidly growing environment. At MSCI, you will be operating in a culture where we value merit and track record. You will own the full life-cycle of the technology services and provide management, technical and people leadership in the design, development, quality assurance and maintenance of our production systems, making sure we continue to scale our great franchise.

Responsibilities
• Engages technical teams and business stakeholders to discuss and propose technical approaches to meet current and future needs
• Defines the technical target state of the product and drives achievement of the strategy
• As the Lead Architect, you will be responsible for leading the design, development, and maintenance of our data architecture, ensuring scalability, efficiency, and reliability
• Create and maintain comprehensive documentation for the architecture, processes, and best practices, including Architecture Decision Records (ADRs)
• Evaluates recommendations and provides feedback on new technologies
• Develops secure and high-quality production code, and reviews and debugs code written by others
• Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
• Collaborating with a cross-functional team to draft, implement and adapt the overall architecture of our products and support infrastructure in conjunction with software development managers and product management teams
• Staying abreast of new technologies and issues in the software-as-a-service industry, including current technologies, platforms, standards and methodologies
• Being actively engaged in setting technology standards that impact the company and its offerings
• Ensuring the knowledge sharing of engineering best practices across departments, and developing and monitoring technical standards to ensure adherence to them

Qualifications
• Prior senior software architecture roles
• Demonstrated proficiency in programming languages such as Python/Java/Scala and knowledge of SQL and NoSQL databases
• Drive the development of conceptual, logical, and physical data models aligned with business requirements
• Lead the implementation and optimization of data technologies, including Apache Spark
• Experience with one of the table formats, such as Delta or Iceberg
• Strong hands-on experience in data architecture, database design, and data modeling
• Proven experience as a Data Platform Architect or in a similar role, with expertise in Airflow, Databricks, Snowflake, Collibra, and Dremio
• Experience with cloud platforms such as AWS, Azure, or Google Cloud
• Ability to dive into details; a hands-on technologist with strong core computer science fundamentals
• Strong preference for financial services experience
• Proven leadership of large-scale distributed software teams that have delivered great products on deadline
• Experience in a modern iterative software development methodology
• Experience with globally distributed teams and business partners
• Experience in building and maintaining applications that are mission-critical for customers
• M.S. in Computer Science, Management Information Systems or a related engineering field
• 15+ years of software engineering experience
• Demonstrated consensus builder and collegial peer

What we offer you
• Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
• Flexible working arrangements, advanced technology, and collaborative workspaces.
• A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
• A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
• Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
• Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
• We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities.
If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 4 days ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Data Ops Capability Deployment - Analyst is a seasoned professional role. It applies in-depth disciplinary knowledge, contributing to the development of new solutions/frameworks/techniques and the improvement of processes and workflow for the Enterprise Data function. It integrates subject matter and industry expertise within a defined area, and requires an in-depth understanding of how areas collectively integrate within the sub-function as well as coordinate and contribute to the objectives of the function and overall business.

The primary purpose of this role is to perform data analytics and data analysis across different asset classes, and to build data science/tooling capabilities within the team. This will involve working closely with the wider Enterprise Data team, in particular the front-to-back leads, to deliver business priorities. The role is within the B&I Data Capabilities team within Enterprise Data. The team manages the data quality/metrics/controls program in addition to a broad remit to implement and embed improved data governance and data management practices throughout the region. The data quality program is centered on enhancing Citi's approach to data risk and addressing regulatory commitments in this area.

Key Responsibilities:
- Hands-on data engineering background with a thorough understanding of distributed data platforms and cloud services.
- Sound understanding of data architecture and data integration with enterprise applications.
- Research and evaluate new data technologies, data mesh architecture and self-service data platforms.
- Work closely with the Enterprise Architecture Team on the definition and refinement of the overall data strategy.
- Address performance bottlenecks, design batch orchestrations, and deliver reporting capabilities.
- Perform complex data analytics (data cleansing, transformation, joins, aggregation, etc.) on large, complex datasets.
- Build analytics dashboards and data science capabilities for Enterprise Data platforms.
- Communicate complicated findings and propose solutions to a variety of stakeholders.
- Understand business and functional requirements provided by business analysts and convert them into technical design documents.
- Work closely with cross-functional teams, e.g. Business Analysis, Product Assurance, Platforms and Infrastructure, Business Office, Control and Production Support.
- Prepare handover documents and manage SIT, UAT and implementation.
- Demonstrate an in-depth understanding of how the development function integrates within overall business/technology to achieve objectives; requires a good understanding of the banking industry.
- Perform other duties and functions as assigned.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Skills & Qualifications
- 10+ years of active development background and experience in Financial Services or Finance IT is required.
- Experience with data quality/data tracing/data lineage/metadata management tools.
- Hands-on experience with ETL using PySpark on distributed platforms, along with data ingestion, Spark optimization, resource utilization, capacity planning and batch orchestration.
- In-depth understanding of Hive, HDFS, Airflow and job schedulers.
- Strong programming skills in Python with experience in data manipulation and analysis libraries (pandas, NumPy).
- Able to write complex SQL and stored procedures.
- Experience with DevOps, Jenkins/Lightspeed, Git, CoPilot.
- Strong knowledge of one or more BI visualization tools such as Tableau or Power BI.
- Proven experience in implementing data lakes/data warehouses for enterprise use cases.
- Exposure to analytical tools and AI/ML is desired.

Education: Bachelor's/University degree or master's degree in Information Systems, Business Analysis or Computer Science.

Job Family Group: Data Governance
Job Family: Data Governance Foundation
Time Type: Full time

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
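Since the role leans on Airflow and batch orchestration, a minimal DAG sketch, assuming Airflow 2.4+ for the `schedule` argument; the DAG ID and task are hypothetical.

    # Hypothetical daily DAG; assumes Airflow 2.4+ (the `schedule` argument).
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_quality_checks():
        print("data-quality rules would run here")  # placeholder task body

    with DAG(
        dag_id="dq_metrics_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="quality_checks", python_callable=run_quality_checks)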
Posted 4 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role: MLOps Engineer
Location: Chennai - CKC
Mode of Interview: In Person
Date: 7th June 2025 (Saturday)

Keywords/Skillset:
- AWS SageMaker, Azure ML Studio, GCP Vertex AI
- PySpark, Azure Databricks
- MLflow, Kubeflow, Airflow, GitHub Actions, AWS CodePipeline
- Kubernetes, AKS, Terraform, FastAPI

Responsibilities:
- Model deployment, model monitoring, model retraining
- Deployment pipeline, inference pipeline, monitoring pipeline, retraining pipeline
- Drift detection: data drift, model drift
- Experiment tracking
- MLOps architecture
- REST API publishing

Job Responsibilities:
- Research and implement MLOps tools, frameworks and platforms for our Data Science projects.
- Work on a backlog of activities to raise MLOps maturity in the organization.
- Proactively introduce a modern, agile and automated approach to Data Science.
- Conduct internal training and presentations about MLOps tools' benefits and usage.

Required Experience And Qualifications:
- Wide experience with Kubernetes.
- Experience in operationalization of Data Science projects (MLOps) using at least one of the popular frameworks or platforms (e.g. Kubeflow, AWS SageMaker, Google AI Platform, Azure Machine Learning, DataRobot, DKube).
- Good understanding of ML and AI concepts.
- Hands-on experience in ML model development.
- Proficiency in Python used both for ML and automation tasks.
- Good knowledge of Bash and the Unix command-line toolkit.
- Experience in CI/CD/CT pipeline implementation.
- Experience with cloud platforms - preferably AWS - would be an advantage.
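For the experiment-tracking item above, a minimal MLflow run sketch; the model, data, and run name are toy stand-ins.

    # Toy MLflow experiment-tracking run; model, data, and names are stand-ins.
    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=200, random_state=0)
    with mlflow.start_run(run_name="rf-baseline"):
        model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
        mlflow.log_param("n_estimators", 50)
        mlflow.log_metric("train_accuracy", model.score(X, y))
        mlflow.sklearn.log_model(model, artifact_path="model")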
Posted 4 days ago
4.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
About This Role
Aladdin Data is at the heart of Aladdin, and increasingly the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage. The DOE team is responsible for the data ecosystem within BlackRock. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, notably investors, operations teams and data scientists. We focus on evolving our platform to deliver exponential scale to the firm, powering the future growth of Aladdin.

Data Pipeline Engineers at BlackRock get to experience working at one of the most recognized financial companies in the world while being part of a software development team responsible for next-generation technologies and solutions. Our engineers design and build large-scale data storage, computation and distribution systems. They partner with data and analytics experts to deliver high-quality analytical and derived data to our consumers. We are looking for data engineers who like to innovate and seek complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. We are committed to open source, and we regularly give our work back to the community. Engineers looking to work in the areas of orchestration, data modeling, data pipelines, APIs, storage, distribution, distributed computation, consumption and infrastructure are ideal candidates.

Responsibilities
Data Pipeline Engineers are expected to be involved from the inception of projects: understand requirements, then architect, develop, deploy, and maintain data pipelines (ETL/ELT). Typically, they work in a multi-disciplinary squad (we follow Agile!), which involves partnering with program and product managers to expand the product offering based on business demands. Design is an iterative process, whether for UX, services or infrastructure. Our goal is to drive up user engagement and adoption of the platform while constantly working towards modernizing and improving platform performance and scalability. Deployment and maintenance require close interaction with various teams. This requires maintaining a positive and collaborative working relationship with teams within DOE as well as with the wider Aladdin developer community. Production support for applications is usually required for issues that cannot be resolved by the operations team. Creative and inventive problem-solving skills for reduced turnaround times are highly valued. Preparing user documentation to maintain both development and operations continuity is integral to the role.

An ideal candidate would have:
- At least 4+ years of experience as a data engineer
- Experience in SQL, Sybase, and Linux (a must)
- Experience coding in two of these languages for server-side/data processing: Java, Python, C++
- 2+ years of experience using a modern data stack (Spark, Snowflake, BigQuery, etc.) on cloud platforms (Azure, GCP, AWS)
- Experience building ETL/ELT pipelines for complex data engineering projects (using Airflow, dbt, Great Expectations would be a plus)
- Experience with database modeling and normalization techniques
- Experience with object-oriented design patterns
- Experience with DevOps tools like Git, Maven, Jenkins, GitLab CI, Azure DevOps
- Experience with Agile development concepts and related tools
- Ability to troubleshoot and fix performance issues across the codebase and database queries
- Excellent written and verbal communication skills
- Ability to operate in a fast-paced environment
- Strong interpersonal skills with a can-do attitude under challenging circumstances
- BA/BS or equivalent practical experience

Skills That Would Be a Plus
- Perl, ETL tools (Informatica, Talend, dbt, etc.)
- Experience with Snowflake or other cloud data warehousing products
- Exposure to workflow management tools such as Airflow
- Exposure to messaging platforms such as Kafka
- Exposure to NoSQL platforms such as Cassandra, MongoDB
- Building and delivering REST APIs

Our Benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model
BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
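As a toy version of the ETL/ELT pipeline work this posting centers on, a pandas plus SQLAlchemy sketch; the connection strings, table, and transform are placeholders.

    # Toy extract-transform-load step; connection strings, table, and transform are placeholders.
    import pandas as pd
    from sqlalchemy import create_engine

    src = create_engine("postgresql+psycopg2://user:pass@src-host/source")
    dst = create_engine("postgresql+psycopg2://user:pass@dw-host/warehouse")

    df = pd.read_sql("SELECT account_id, balance, as_of FROM positions", src)
    df = df.dropna(subset=["balance"])            # placeholder data-quality rule
    df.to_sql("positions_clean", dst, schema="analytics", if_exists="replace", index=False)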
Posted 4 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
🚀 We're Hiring: Senior Software Engineer – GCP | Python | Angular

We're looking for a highly skilled and passionate Software Engineer to join our fast-paced, product-focused engineering team. In this role, you'll be involved in end-to-end development — from design and implementation to testing, deployment, and support. If you thrive in a modern cloud-native, CI/CD-driven development environment and enjoy working on impactful features across the stack, we'd love to hear from you!

📍 Location: Chennai
💼 Join a cutting-edge digital team powering innovation for a globally renowned automotive and mobility leader.

🔧 Key Skills Required:
- Languages & Frameworks: Python, Java, JavaScript (Node.js), Angular, RESTful APIs
- Cloud & DevOps: Google Cloud Platform (GCP), Cloud Run, BigQuery, Git, Jenkins, CI/CD
- Data & Infrastructure: Dataflow, Terraform, Airflow
- Testing & Best Practices: Jest, Mocha, TDD, Clean Code, Design Patterns

👤 Experience & Qualifications:
- 5+ years of professional software development experience
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience building scalable full-stack solutions in a cloud environment
- Strong understanding of Agile, CI/CD, and DevOps practices

✨ Why Join Us?
- Work on cutting-edge tech including LLM integrations
- Be part of a team that values quality, ownership, and innovation
- Collaborate across product, engineering, and DevOps in a cloud-native setup

📩 Interested? Drop your profile or DM for a quick conversation.
📌 Immediate to 30 days joiners preferred

#Hiring #SoftwareEngineer #FullStackDeveloper #Python #Angular #GCP #CloudRun #BigQuery #DevOps #LLM #CI/CD #ImmediateJoiners #ChennaiJobs #GoogleCloud
Posted 4 days ago
3.0 - 6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
dunnhumby is the global leader in Customer Data Science, empowering businesses everywhere to compete and thrive in the modern data-driven economy. We always put the Customer First. Our mission: to enable businesses to grow and reimagine themselves by becoming advocates and champions for their Customers. With deep heritage and expertise in retail – one of the world's most competitive markets, with a deluge of multi-dimensional data – dunnhumby today enables businesses all over the world, across industries, to be Customer First.

dunnhumby employs nearly 2,500 experts in offices throughout Europe, Asia, Africa, and the Americas working for transformative, iconic brands such as Tesco, Coca-Cola, Meijer, Procter & Gamble and Metro. Most companies try to meet expectations; dunnhumby exists to defy them. Using big data, deep expertise and AI-driven platforms to decode the 21st century human experience – then redefine it in meaningful and surprising ways that put customers first. Across digital, mobile and retail. For brands like Tesco, Coca-Cola, Procter & Gamble and PepsiCo.

We're looking for a Big Data Engineer to join our strategic Loyalty and Personalisation team, which builds products retailers can use to find the optimal customer segments and send personalised offers and digital recommendations to the consumer. These products are strategic assets that improve consumer loyalty, which makes them very important to retailers and therefore to dunnhumby.

What We Expect From You:
- 3 to 6 years of experience in software development using Python.
- Hands-on experience in Python OOP, design patterns, dependency injection, data libraries (pandas), and data structures.
- Exposure to Spark: PySpark, the architecture of Spark, and best practices to optimize jobs.
- Experience in the Hadoop ecosystem: HDFS, Hive, or YARN.
- Experience with orchestration tools: Airflow, Argo Workflows, Kubernetes.
- Experience with cloud-native services (GCP/Azure/AWS), preferably GCP.
- Database knowledge: SQL, NoSQL.
- Hands-on exposure to CI/CD pipelines for data engineering workflows.
- Testing: pytest for unit testing, pytest-spark to create a test Spark session, and Spark UI for performance tuning and monitoring.

Good To Have:
- Scala

What You Can Expect From Us
We won't just meet your expectations. We'll defy them. So you'll enjoy the comprehensive rewards package you'd expect from a leading technology company. But also, a degree of personal flexibility you might not expect. Plus, thoughtful perks, like flexible working hours and your birthday off.

You'll also benefit from an investment in cutting-edge technology that reflects our global ambition. But with a nimble, small-business feel that gives you the freedom to play, experiment and learn. And we don't just talk about diversity and inclusion. We live it every day – with thriving networks including dh Gender Equality Network, dh Proud, dh Family, dh One, dh Enabled and dh Thrive as the living proof. Everyone's invited.

We want everyone to have the opportunity to shine and perform at their best throughout our recruitment process. Please let us know how we can make this process work best for you.

Our approach to Flexible Working
At dunnhumby, we value and respect difference and are committed to building an inclusive culture by creating an environment where you can balance a successful career with your commitments and interests outside of work. We believe that you will do your best at work if you have a work/life balance. Some roles lend themselves to flexible options more than others, so if this is important to you please raise this with your recruiter, as we are open to discussing agile working opportunities during the hiring process.
For further information about how we collect and use your personal information, please see our Privacy Notice, which can be found (here).
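The testing bullet above mentions pytest and a test Spark session; here is a hand-rolled sketch of that pattern (pytest-spark ships an equivalent ready-made fixture), with invented test data.

    # Hand-rolled local Spark fixture for unit tests; pytest-spark offers an equivalent fixture.
    import pytest
    from pyspark.sql import SparkSession

    @pytest.fixture(scope="session")
    def spark():
        session = SparkSession.builder.master("local[1]").appName("tests").getOrCreate()
        yield session
        session.stop()

    def test_offer_counts(spark):
        df = spark.createDataFrame([("c1", "offer_a"), ("c1", "offer_b")], ["customer", "offer"])
        assert df.groupBy("customer").count().first()["count"] == 2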
Posted 4 days ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Summary We are seeking a highly experienced and customer-focused Presales Architect to join our Solution Engineering team. The ideal candidate will have a strong background in AWS IaaS, PaaS, and SaaS services, deep expertise in cloud architecture, and solid exposure to data platforms, including Amazon Redshift, AI/ML workloads, and modern data architectures. Familiarity with Azure and Google Cloud Platform (GCP) is a strong advantage. This role is a strategic blend of technical solutioning, customer engagement, and sales support, playing a critical role in the pre-sales cycle by understanding customer requirements, designing innovative solutions, and aligning them with the company’s service offerings. Key Responsibilities Pre-Sales and Solutioning: Engage with enterprise customers to understand their technical requirements and business objectives. Architect end-to-end cloud solutions on AWS, covering compute, storage, networking, DevOps, and security. Develop compelling solution proposals, high-level designs, and reference architectures that address customer needs. Support RFI/RFP responses, create technical documentation, and deliver presentations and demos to technical and non-technical audiences. Collaborate with Sales, Delivery, and Product teams to ensure alignment of proposed solutions with client expectations. Conduct technical workshops, proof of concepts (PoCs), and technical validations. Technical Expertise Deep hands-on knowledge and architecture experience with AWS services: IaaS: EC2, VPC, S3, EBS, ELB, Auto Scaling, etc. PaaS: RDS, Lambda, API Gateway, Fargate, DynamoDB, Aurora, Step Functions. SaaS & Security: AWS Organizations, IAM, AWS WAF, CloudTrail, GuardDuty. Understanding of multi-cloud strategies; exposure to Azure and GCP cloud services including hybrid architectures is a plus. Strong knowledge of DevOps practices and tools like Terraform, CloudFormation, Jenkins, GitOps, etc. Proficiency in architecting solutions that meet scalability, availability, and security requirements. Data Platform & AI/ML Experience in designing data lakes, data pipelines, and analytics platforms on AWS. Hands-on expertise in Amazon Redshift, Athena, Glue, EMR, Kinesis, and S3-based architectures. Familiarity with AI/ML solutions using SageMaker, AWS Comprehend, or other ML frameworks. Understanding of data governance, data cataloging, and security best practices for analytics workloads. Required Skills & Qualifications Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field. 10+ years of experience in IT, with 5+ years in cloud architecture and pre-sales roles. AWS Certified Solutions Architect – Professional (or equivalent certification) is preferred. Strong presentation skills and experience interacting with CXOs, Architects, and DevOps teams. Ability to translate technical concepts into business value propositions. Excellent communication, proposal writing, and stakeholder management skills. Nice To Have Experience with Azure (e.g., Synapse, AKS, Azure ML) or GCP (e.g., BigQuery, Vertex AI). Familiarity with industry-specific solutions (e.g., fintech, healthcare, retail cloud transformations). Exposure to AI/ML MLOps pipelines and orchestration tools like Kubeflow, MLflow, or Airflow.
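By way of illustration, a minimal sketch of programmatic access to the AWS data-platform services this role covers, using boto3 to walk a Glue data catalog. The region and database name are hypothetical placeholders, not part of the posting:

```python
# glue_catalog_audit.py - sketch: list tables in a Glue database behind an S3 data lake
import boto3

glue = boto3.client("glue", region_name="ap-south-1")  # hypothetical region

# Hypothetical catalog database; GetTables is paginated, so use the paginator
paginator = glue.get_paginator("get_tables")
for page in paginator.paginate(DatabaseName="analytics_lake"):
    for table in page["TableList"]:
        location = table.get("StorageDescriptor", {}).get("Location", "n/a")
        print(f'{table["Name"]}: {location}')
```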
Posted 4 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At Valtech, we’ve got opportunities to offer you: for growth, for leadership, for big, world-changing impact and for, dare we say it, fun. We are a global workforce committed to building a world that works better for everyone. And that starts with our kin. The role: Ideal candidates will have strong quantitative backgrounds and analytical discipline. While they will have some demonstrated ability to write code, they will not have learned programming languages for the sake of building their resume, but rather as a means to express their intellectual curiosity and analytical voice. Valtech will provide a platform and training to help them reach their full potential. Valtech is looking to hire a Business Intelligence Analyst / Data Engineer to join our growing capabilities team. If you are innovative, passionate about data and AI technologies, and look to continually learn and enjoy sharing expertise, read on! Role Responsibilities Analyze a collection of raw data sets to create meaningful impact for large enterprise clients while maintaining a high degree of scientific rigor and discipline. Engineer data pipelines and products to help stakeholders make and execute data-driven decisions. Communicate analytical findings in an intuitive and visually compelling way. Create highly visual and interactive dashboards via Tableau, PowerBI, or custom web applications. Conduct deep-dive analyses and design KPIs to help guide business decisions and measure success. Engineer data infrastructure, software libraries, and APIs supporting BI and ML data pipelines. Architect cloud data platform components enabling the above. Build and track project timelines, dependencies, and risks. Gather stakeholder requirements and conduct technical due diligence toward designing pragmatic data-driven business solutions. Minimum Qualifications We want all new hires to succeed in their roles at Valtech. That's why we've outlined the job requirements below. To be considered for this role, it's important that you meet all Minimum Qualifications. If you do not meet all of the Preferred Qualifications, we still encourage you to apply. Proven industry experience executing data engineering, analytics, and/or data science projects, or a Bachelor's/Master's degree in quantitative studies including Engineering, Mathematics, Statistics, Computer Science or computation-intensive Sciences and Humanities. Proficiency (can execute data ingestion to insight) in programming languages such as SQL, Python, and R. Preferred Qualifications Proficiency in visualization/reporting tools such as Tableau and PowerBI, or a programmatic visualization library such as R ggplot2, Python matplotlib/seaborn/bokeh, or JavaScript D3. Proficiency in big data environments and tools such as Spark, Hive, Impala, Pig, etc. Proficiency with cloud architecture components (AWS, Azure, Google). Proficiency with data pipeline software such as Airflow, Luigi, or Prefect (see the sketch below). Ability to turn raw data and ambiguous business questions into distilled findings and recommendations for action. Experience with statistical and machine learning libraries along with the ability to apply them appropriately to business problems. Experience leading and managing technical data/analytics/machine learning projects. What We Offer You’ll join an international network of data professionals within our organisation. We support continuous development through our dedicated Academy.
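As an illustration of the pipeline tooling named in the preferred qualifications, here is a minimal Airflow DAG sketch (it assumes Airflow 2.4+ for the schedule argument); the DAG and task names are hypothetical:

```python
# bi_daily_refresh.py - sketch of a two-step daily pipeline (assumes Airflow 2.4+)
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw data from source systems")


def transform():
    print("clean and aggregate for the BI dashboards")


with DAG(
    dag_id="bi_daily_refresh",  # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # run extract before transform
```

In a real deployment each callable would invoke the team's ETL code; the print statements only mark where that logic would live.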
If you're looking to push the boundaries of innovation and creativity in a culture that values freedom and responsibility, we encourage you to apply. At Valtech, we’re here to engineer experiences that work and reach every single person. To do this, we are proactive about creating workplaces that work for every person at Valtech. Our goal is to create an equitable workplace which gives people from all backgrounds the support they need to thrive, grow and meet their goals (whatever they may be). You can find out more about what we’re doing to create a Valtech for everyone here. Please do not worry if you do not meet all of the criteria or if you have some gaps in your CV. We’d love to hear from you and see if you’re our next member of the Valtech team!
Posted 4 days ago
0 years
0 Lacs
India
On-site
Job Title: Automation Engineer - Databricks Job Type: Full-time, Contractor Location: Hybrid - Hyderabad | Pune | Delhi About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market. Job Summary: We are seeking a detail-oriented and innovative Automation Engineer - Databricks to join our customer's team. In this critical role, you will design, develop, and execute automated tests to ensure the quality, reliability, and integrity of data within Databricks environments. If you are passionate about data quality, thrive in collaborative environments, and excel at both written and verbal communication, we'd love to meet you. Key Responsibilities: Design, develop, and maintain robust automated test scripts using Python, Selenium, and SQL to validate data integrity within Databricks environments. Execute comprehensive data validation and verification activities to ensure accuracy and consistency across multiple systems, data warehouses, and data lakes. Create detailed and effective test plans and test cases based on technical requirements and business specifications. Integrate automated tests with CI/CD pipelines to facilitate seamless and efficient testing and deployment processes. Work collaboratively with data engineers, developers, and other stakeholders to gather data requirements and achieve comprehensive test coverage. Document test cases, results, and identified defects; communicate findings clearly to the team. Conduct performance testing to ensure data processing and retrieval meet established benchmarks. Provide mentorship and guidance to junior team members, promoting best practices in test automation and data validation. Required Skills and Qualifications: Strong proficiency in Python, Selenium, and SQL for developing test automation solutions. Hands-on experience with Databricks, data warehouse, and data lake architectures. Proven expertise in automated testing of data pipelines, preferably with tools such as Apache Airflow, dbt Test, or similar. Proficient in integrating automated tests within CI/CD pipelines on cloud platforms (AWS, Azure preferred). Excellent written and verbal communication skills with the ability to translate technical concepts to diverse audiences. Bachelor’s degree in Computer Science, Information Technology, or a related discipline. Demonstrated problem-solving skills and a collaborative approach to teamwork. Preferred Qualifications: Experience with implementing security and data protection measures in data-driven applications. Ability to integrate user-facing elements with server-side logic for seamless data experiences. Demonstrated passion for continuous improvement in test automation processes, tools, and methodologies.
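For illustration, a minimal sketch of the kind of automated data-validation test this role describes, written with pytest and PySpark. The table names are hypothetical; on Databricks the cluster's session would be used instead of the local one built here:

```python
# test_data_quality.py - sketch of automated checks between source and target tables
import pytest
from pyspark.sql import SparkSession


@pytest.fixture(scope="session")
def spark():
    # Locally this builds a small session; on Databricks the cluster session applies.
    return SparkSession.builder.master("local[2]").appName("dq-tests").getOrCreate()


def test_row_counts_match(spark):
    # Hypothetical tables: a raw landing table and its curated copy
    source = spark.table("raw.orders")
    target = spark.table("curated.orders")
    assert source.count() == target.count()


def test_no_null_keys(spark):
    target = spark.table("curated.orders")
    assert target.filter(target.order_id.isNull()).count() == 0
```

Checks like these are the natural unit to wire into a CI/CD pipeline so every deployment re-validates the data it touches.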
Posted 4 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Lead, Data Engineer Who is Mastercard? Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all. Overview The Mastercard Services Technology team is looking for a Lead in Data Engineering, to drive our mission to unlock potential of data assets by consistently innovating, eliminating friction in how we manage big data assets, store those assets, accessibility of data and, enforce standards and principles in the Big Data space both on public cloud and on-premises set up. We are looking for a hands-on, passionate Data Engineer who is not only technically strong in PySpark, cloud platforms, and building modern data architectures, but also deeply committed to learning, growing, and lifting others. The person will play a key role in designing and building scalable data solutions, shaping our engineering culture, and mentoring team members. This is a role for builders and collaborators—engineers who love clean data pipelines, cloud-native design, and helping teammates succeed. Role Design and build scalable, cloud-native data platforms using PySpark, Python, and modern data engineering practices. Mentor and guide other engineers, sharing knowledge, reviewing code, and fostering a culture of curiosity, growth, and continuous improvement. Create robust, maintainable ETL/ELT pipelines that integrate with diverse systems and serve business-critical use cases. Lead by example—write high-quality, testable code and participate in architecture and design discussions with a long-term view in mind. Decompose complex problems into modular, efficient, and scalable components that align with platform and product goals. Champion best practices in data engineering, including testing, version control, documentation, and performance tuning. Drive collaboration across teams, working closely with product managers, data scientists, and other engineers to deliver high-impact solutions. Support data governance and quality efforts, ensuring data lineage, cataloging, and access management are built into the platform. Continuously learn and apply new technologies, frameworks, and tools to improve team productivity and platform reliability. Own and optimize cloud infrastructure components related to data engineering workflows, storage, processing, and orchestration. 
Participate in architectural discussions, iteration planning, and feature sizing meetings. Adhere to Agile processes and participate actively in agile ceremonies. Stakeholder management skills. All About You 5+ years of hands-on experience in data engineering with strong PySpark and Python skills. Solid experience designing and implementing data models, pipelines, and batch/stream processing systems. Proven ability to work with cloud platforms (AWS, Azure, or GCP), especially in data-related services like S3, Glue, Data Factory, Databricks, etc. Strong foundation in data modeling, database design, and performance optimization. Understanding of modern data architectures (e.g., lakehouse, medallion) and data lifecycle management. Comfortable with CI/CD practices, version control (e.g., Git), and automated testing. Demonstrated ability to mentor and uplift junior engineers—strong communication and collaboration skills. Bachelor’s degree in Computer Science, Engineering, or related field—or equivalent hands-on experience. Comfortable working in Agile/Scrum development environments. Curious, adaptable, and driven by problem-solving and continuous improvement. Good To Have Experience integrating heterogeneous systems and building resilient data pipelines across cloud environments. Familiarity with orchestration tools (e.g., Airflow, dbt, Step Functions, etc.). Exposure to data governance tools and practices (e.g., Lake Formation, Purview, or Atlan). Experience with containerization and infrastructure automation (e.g., Docker, Terraform) will be a good addition. Master’s degree, relevant certifications (e.g., AWS Certified Data Analytics, Azure Data Engineer), or demonstrable contributions to open source/data engineering communities will be a bonus. Exposure to machine learning data pipelines or MLOps is a plus. Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach, and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-251380
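For illustration, a minimal PySpark sketch of a bronze-to-silver step in the medallion pattern the posting mentions; the lake paths and column names are hypothetical placeholders:

```python
# bronze_to_silver.py - sketch of a medallion-style cleansing step
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Hypothetical lake path for the raw (bronze) layer
bronze = spark.read.parquet("s3://lake/bronze/transactions/")

silver = (
    bronze
    .dropDuplicates(["transaction_id"])               # de-duplicate replayed events
    .filter(F.col("amount").isNotNull())              # drop records missing the amount
    .withColumn("event_date", F.to_date("event_ts"))  # derive a partition column
)

silver.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://lake/silver/transactions/"
)
```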
Posted 4 days ago
10.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Summary We are seeking a highly experienced and customer-focused Presales Architect to join our Solution Engineering team. The ideal candidate will have a strong background in AWS IaaS, PaaS, and SaaS services, deep expertise in cloud architecture, and solid exposure to data platforms, including Amazon Redshift, AI/ML workloads, and modern data architectures. Familiarity with Azure and Google Cloud Platform (GCP) is a strong advantage. This role is a strategic blend of technical solutioning, customer engagement, and sales support, playing a critical role in the pre-sales cycle by understanding customer requirements, designing innovative solutions, and aligning them with the company’s service offerings. Key Responsibilities Pre-Sales and Solutioning: Engage with enterprise customers to understand their technical requirements and business objectives. Architect end-to-end cloud solutions on AWS, covering compute, storage, networking, DevOps, and security. Develop compelling solution proposals, high-level designs, and reference architectures that address customer needs. Support RFI/RFP responses, create technical documentation, and deliver presentations and demos to technical and non-technical audiences. Collaborate with Sales, Delivery, and Product teams to ensure alignment of proposed solutions with client expectations. Conduct technical workshops, proof of concepts (PoCs), and technical validations. Technical Expertise Deep hands-on knowledge and architecture experience with AWS services: IaaS: EC2, VPC, S3, EBS, ELB, Auto Scaling, etc. PaaS: RDS, Lambda, API Gateway, Fargate, DynamoDB, Aurora, Step Functions. SaaS & Security: AWS Organizations, IAM, AWS WAF, CloudTrail, GuardDuty. Understanding of multi-cloud strategies; exposure to Azure and GCP cloud services including hybrid architectures is a plus. Strong knowledge of DevOps practices and tools like Terraform, CloudFormation, Jenkins, GitOps, etc. Proficiency in architecting solutions that meet scalability, availability, and security requirements. Data Platform & AI/ML Experience in designing data lakes, data pipelines, and analytics platforms on AWS. Hands-on expertise in Amazon Redshift, Athena, Glue, EMR, Kinesis, and S3-based architectures. Familiarity with AI/ML solutions using SageMaker, AWS Comprehend, or other ML frameworks. Understanding of data governance, data cataloging, and security best practices for analytics workloads. Required Skills & Qualifications Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field. 10+ years of experience in IT, with 5+ years in cloud architecture and pre-sales roles. AWS Certified Solutions Architect – Professional (or equivalent certification) is preferred. Strong presentation skills and experience interacting with CXOs, Architects, and DevOps teams. Ability to translate technical concepts into business value propositions. Excellent communication, proposal writing, and stakeholder management skills. Nice To Have Experience with Azure (e.g., Synapse, AKS, Azure ML) or GCP (e.g., BigQuery, Vertex AI). Familiarity with industry-specific solutions (e.g., fintech, healthcare, retail cloud transformations). Exposure to AI/ML MLOps pipelines and orchestration tools like Kubeflow, MLflow, or Airflow.
Posted 4 days ago
3.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
About Beco Beco (letsbeco.com) is a fast-growing Mumbai-based consumer-goods company on a mission to replace everyday single-use plastics with planet-friendly, bamboo- and plant-based alternatives. From reusable kitchen towels to biodegradable garbage bags, we make sustainable living convenient, affordable and mainstream. Our founding story began with a Mumbai beach clean-up that opened our eyes to the decades-long life of a single plastic wrapper—sparking our commitment to “Be Eco” every day. Our mission: “To craft, support and drive positive change with sustainable & eco-friendly alternatives—one Beco product at a time.” Backed by marquee climate-focused VCs and now 50+ employees, we are scaling rapidly across India’s top marketplaces, retail chains and D2C channels. Why we’re hiring Sustainability at scale demands operational excellence. As volumes explode, we need data-driven, self-learning systems that eliminate manual grunt work, unlock efficiency and delight customers. You will be the first dedicated AI/ML Engineer at Beco—owning the end-to-end automation roadmap across Finance, Marketing, Operations, Supply Chain and Sales. Responsibilities Partner with functional leaders to translate business pain-points into AI/ML solutions and automation opportunities. Own the complete lifecycle: data discovery, cleaning, feature engineering, model selection, training, evaluation, deployment and monitoring. Build robust data pipelines (SQL/BigQuery, Spark) and APIs to integrate models with ERP, CRM and marketing automation stacks. Stand up CI/CD + MLOps (Docker, Kubernetes, Airflow, MLflow, Vertex AI/SageMaker) for repeatable training and one-click releases. Establish data-quality, drift-detection and responsible-AI practices (bias, transparency, privacy). Mentor analysts & engineers; evangelise a culture of experimentation and “fail-fast” learning—core to Beco’s GSD (“Get Sh#!t Done”) values. Must-have Qualifications 3+ years of hands-on experience delivering ML, data-science or intelligent-automation projects in production. Proficiency in Python (pandas, scikit-learn, PyTorch/TensorFlow) and SQL; solid grasp of statistics, experimentation and feature engineering. Experience building and scaling ETL/data pipelines on cloud (GCP, AWS or Azure). Familiarity with modern Gen-AI & NLP stacks (OpenAI, Hugging Face, RAG, vector databases). Track record of collaborating with cross-functional stakeholders and shipping iteratively in an agile environment. Nice-to-haves Exposure to e-commerce or FMCG supply-chain data. Knowledge of finance workflows (Reconciliation, AR/AP, FP&A) or RevOps tooling (HubSpot, Salesforce). Experience with vision models (Detectron2, YOLO) and edge deployment. Contributions to open-source ML projects or published papers/blogs. What Success Looks Like After 1 Year 70% reduction in manual reporting hours across finance and ops. Forecast accuracy > 85% at SKU level, slashing stock-outs by 30%. AI chatbot resolves 60% of tickets end-to-end, with CSAT > 4.7/5. At least two new data products launched that directly boost topline or margin. Life at Beco Purpose-driven team obsessed with measurable climate impact. An “entrepreneurial, accountable, bold” culture—where winning minds precede outside victories.
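For illustration, a minimal scikit-learn sketch of the SKU-level demand forecasting the success criteria mention. The input file, feature set, and column names are hypothetical, and it assumes a recent scikit-learn (0.24+) for mean_absolute_percentage_error:

```python
# sku_forecast.py - sketch of a simple SKU-level demand model
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split

# Hypothetical history: one row per SKU per week with simple lag features
df = pd.read_csv("sku_weekly_sales.csv")  # columns: sku_id, week, lag_1, lag_4, promo, units
X = df[["lag_1", "lag_4", "promo"]]
y = df["units"]

# shuffle=False keeps the chronological split a time series needs
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, shuffle=False)

model = GradientBoostingRegressor().fit(X_train, y_train)
mape = mean_absolute_percentage_error(y_test, model.predict(X_test))
print(f"Forecast accuracy ~ {1 - mape:.0%}")  # rough accuracy = 1 - MAPE
```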
Posted 4 days ago
5.0 years
0 Lacs
India
On-site
Maximize Your Impact with TP Welcome to TP, a global hub of innovation and empowerment, where we redefine the future. With a remarkable €10 billion annual revenue and a global team of 500,000 employees serving 170 countries in over 300 languages, we lead in intelligent, digital-first solutions. As a globally certified Great Place to Work in 72 countries, our culture thrives on diversity, equity, and inclusion. We value your unique perspective and believe that your talent is the missing piece that completes our vision for a brighter, digitally driven tomorrow. The Opportunity The AI Data Engineer designs, develops, and maintains robust data pipelines to support AI data services operations, ensuring smooth ingestion, transformation, and extraction of large, multilingual, and multimodal datasets. This role collaborates with cross-functional teams to optimize data workflows, implement quality checks, and deliver scalable solutions that underpin our analytics and AI/ML initiatives. The Responsibilities Create and manage ETL workflows using Python and relevant libraries (e.g., Pandas, NumPy) for high-volume data processing. Monitor and optimize data workflows to reduce latency, maximize throughput, and ensure high-quality data availability. Work with Platform Operations, QA, and Analytics teams to guarantee seamless data integration and consistent data accuracy. Implement validation processes and address anomalies or performance bottlenecks in real time. Develop REST API integrations and Python scripts to automate data exchanges with internal systems and BI dashboards. Maintain comprehensive technical documentation, data flow diagrams, and best-practice guidelines. The Qualifications Bachelor’s degree in Computer Science, Data Engineering, Information Technology, or a related field. Relevant coursework in Python programming, database management, or data integration techniques. 3–5 years of professional experience in data engineering, ETL development, or similar roles. Proven track record of building and maintaining scalable data pipelines. Experience working with SQL databases (e.g., MySQL, PostgreSQL) and NoSQL solutions (e.g., MongoDB). AWS Certified Data Analytics – Specialty, Google Cloud Professional Data Engineer, or similar certifications are a plus. Advanced Python proficiency with data libraries (Pandas, NumPy, etc.). Familiarity with ETL/orchestration tools (e.g., Apache Airflow). Understanding of REST APIs and integration frameworks. Experience with version control (Git) and continuous integration practices. Exposure to cloud-based data solutions (AWS, Azure, or GCP) is advantageous. Pre-Employment Screenings By TP policy, employment in this position will be contingent on your successful completion and passage of a comprehensive background check, including global sanctions and watch list screening. Important | Policy on Unsolicited Third-Party Candidate Submissions TP does not accept candidate submissions from unsolicited third parties, including recruiters or headhunters. Applications will not be considered, and no contractual association will be established through such submissions. Diversity, Equity & Inclusion At TP, we are committed to fostering a diverse, equitable, and inclusive workplace. We welcome individuals from all backgrounds and lifestyles and do not discriminate based on gender identity or expression, sexual orientation, race, religion, age, national origin, citizenship, disability, pregnancy status, veteran status, or other differences.
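As a sketch of the ETL work described above, here is a minimal pandas pipeline with a REST extraction step and a basic quality check. The endpoint and column names are hypothetical, and writing Parquet assumes pyarrow is installed:

```python
# etl_job.py - sketch of a small extract-transform-load step with validation
import pandas as pd
import requests

# Extract: hypothetical internal endpoint returning JSON records
resp = requests.get("https://api.example.internal/v1/annotations", timeout=30)
resp.raise_for_status()
df = pd.DataFrame(resp.json())

# Transform: normalize text fields and drop rows failing a basic quality check
df["language"] = df["language"].str.lower().str.strip()
df = df.dropna(subset=["record_id", "language"])
assert df["record_id"].is_unique, "duplicate record_ids in source extract"

# Load: write a partition-friendly file for downstream analytics (needs pyarrow)
df.to_parquet("annotations_clean.parquet", index=False)
```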
Posted 4 days ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
We are hiring for immediate joiners. This is a remote-mode job. Job Title: GCP Data Engineer (Google Cloud Platform) Experience: 4+ Years Location: Chennai (Hybrid) Responsibilities
Google Cloud Platform - BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, API
2+ years in GCP services - BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Memorystore (Redis), Airflow, Cloud Storage
2+ years in data transfer utilities
2+ years in Git / any other version control tool
2+ years in Confluent Kafka
1+ years of experience in API development
2+ years in Agile framework
4+ years of strong experience in Python and PySpark development
4+ years of shell scripting to develop ad hoc jobs for data importing/exporting
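For illustration, a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client; the project id and table are hypothetical placeholders:

```python
# bq_extract.py - sketch of pulling a daily aggregate out of BigQuery
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project id

sql = """
    SELECT event_date, COUNT(*) AS events
    FROM `my-gcp-project.analytics.events`           -- hypothetical table
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY event_date
    ORDER BY event_date
"""

# result() waits for the job; to_dataframe() needs pandas installed
df = client.query(sql).result().to_dataframe()
print(df.head())
```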
Posted 4 days ago
3.0 years
0 Lacs
Delhi, India
On-site
Job Title: GenAI / ML Engineer Function: Research & Development Location: Delhi/Bangalore (3 days in office) About the Company: Elucidata is a TechBio company headquartered in San Francisco. Our mission is to make life sciences data AI-ready. Elucidata’s LLM-powered platform, Polly, helps research teams wrangle, store, manage and analyze large volumes of biomedical data. We are at the forefront of driving GenAI in life sciences R&D across leading BioPharma companies like Pfizer, Janssen, NextGen Jane and many more. We were recognised as the 'Most Innovative Biotech Company, 2024' by Fast Company. We are a 120+ multi-disciplinary team of experts based across the US and India. In September 2022, we raised $16 million in our Series A round led by Eight Roads, F-Prime, and our existing investors Hyperplane and IvyCap. About the Role: We are looking for a GenAI / ML Engineer to join our R&D team and work on cutting-edge applications of LLMs in biomedical data processing. In this role, you'll help build and scale intelligent systems that can extract, summarize, and reason over biomedical knowledge from large bodies of unstructured text, including scientific publications, EHR/EMR reports, and more. You’ll work closely with data scientists, biomedical domain experts, and product managers to design and implement reliable GenAI-powered workflows — from rapid prototypes to production-ready solutions. This is a highly strategic role as we continue to invest in agentic AI systems and LLM-native infrastructure to power the next generation of biomedical applications. Key Responsibilities: Build and maintain LLM-powered pipelines for entity extraction, ontology normalization, Q&A, and knowledge graph creation using tools like LangChain, LangGraph, and CrewAI. Fine-tune and deploy open-source LLMs (e.g., LLaMA, Gemma, DeepSeek, Mistral) for biomedical applications. Define evaluation frameworks to assess accuracy, efficiency, hallucinations, and long-term performance; integrate human-in-the-loop feedback. Collaborate cross-functionally with data scientists, bioinformaticians, product teams, and curators to build impactful AI solutions. Stay current with the LLM ecosystem and drive adoption of cutting-edge tools, models, and methods. Qualifications: 2–3 years of experience as an ML engineer, data scientist, or data engineer working on NLP or information extraction. Strong Python programming skills and experience building production-ready codebases. Hands-on experience with LLM frameworks and tooling (e.g., LangChain, HuggingFace, OpenAI APIs, Transformers). Familiarity with one or more LLM families (e.g., LLaMA, Mistral, DeepSeek, Gemma) and prompt engineering best practices. Strong grasp of ML/DL fundamentals and experience with tools like PyTorch or TensorFlow. Ability to communicate ideas clearly, iterate quickly, and thrive in a fast-paced, product-driven environment. Good to Have (Preferred but Not Mandatory) Experience working with biomedical or clinical text (e.g., PubMed, EHRs, trial data). Exposure to building autonomous agents using CrewAI or LangGraph. Understanding of knowledge graph construction and integration with LLMs. Experience with evaluation challenges unique to GenAI workflows (e.g., hallucination detection, grounding, traceability). Experience with fine-tuning, LoRA, PEFT, or using embeddings and vector stores for retrieval. Working knowledge of cloud platforms (AWS/GCP) and MLOps tools (MLflow, Airflow, etc.).
Contributions to open-source LLM or NLP tooling. We are proud to be an equal-opportunity workplace and are an affirmative action employer. We are committed to equal employment opportunities regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status.
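For illustration, a minimal sketch of transformer-based entity extraction from biomedical text using the Hugging Face pipeline API. The model name is a hypothetical placeholder for whichever biomedical NER checkpoint a team chooses from the Hub:

```python
# biomed_ner.py - sketch of entity extraction from biomedical text
from transformers import pipeline

# Hypothetical checkpoint; substitute a real biomedical NER model from the Hub
ner = pipeline(
    "token-classification",
    model="org/biomedical-ner-model",
    aggregation_strategy="simple",  # merge sub-word tokens into whole entities
)

text = "Patients treated with metformin showed reduced HbA1c levels."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

In an LLM-powered pipeline of the kind described above, output like this would then feed ontology normalization and knowledge-graph construction steps.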
Posted 4 days ago
The Airflow job market in India is rapidly growing as more companies are adopting data pipelines and workflow automation. Airflow, an open-source platform, is widely used for orchestrating complex computational workflows and data processing pipelines. Job seekers with expertise in Airflow can find lucrative opportunities in various industries such as technology, e-commerce, finance, and more.
The average salary range for Airflow professionals in India varies based on experience level:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum
In the field of Airflow, a typical career path may progress as follows:
- Junior Airflow Developer
- Airflow Developer
- Senior Airflow Developer
- Airflow Tech Lead
In addition to Airflow expertise, professionals in this field are often expected to have or develop skills in:
- Python programming
- ETL concepts
- Database management (SQL)
- Cloud platforms (AWS, GCP)
- Data warehousing
As you explore job opportunities in the Airflow domain in India, remember to showcase your expertise, skills, and experience confidently during interviews. Prepare well, stay updated with the latest trends in Airflow, and demonstrate your problem-solving abilities to stand out in the competitive job market. Good luck!