7.0 - 15.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape.
Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics.
Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces.
Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).
Requirements
Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance.
Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
Minimum 7 to 15 years of experience in data architecture or related roles.
Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow).
Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
Knowledge of data integration tools (e.g., Informatica, Talend, Fivetran, Meltano).
Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery).
Experience with data governance frameworks and tools.
Posted 5 days ago
8.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
Overview About this role We are looking for an innovative, hands-on technology leader to run Global Data Operations for one of the largest global FinTechs. This is a new role that will transform how we manage and process high quality data at scale and reflects our commitment to invest in an Enterprise Data Platform to unlock our data strategy for BlackRock and our Aladdin Client Community. A technology-first mindset, to manage and run a modern global data operations function with high levels of automation and engineering, is essential. This role requires a deep understanding of data, domains, and the associated controls. Key Responsibilities The ideal candidate will be a high-energy, technology and data driven individual who has a track record of leading and doing the day-to-day operations. Ensure on-time, high quality data delivery with a single pane of glass for data pipeline observability and support Live and breathe best practices of data ops such as culture, processes and technology Partner cross-functionally to enhance existing data sets, eliminating manual inputs and ensuring high quality, and onboarding new data sets Lead change while ensuring daily operational excellence, quality, and control Build and maintain deep alignment with key internal partners on ops tooling and engineering Foster an agile collaborative culture which is creative, open, supportive, and dynamic Knowledge And Experience 8+ years’ experience in hands-on data operations including data pipeline monitoring and engineering Technical expert including experience with data processing, orchestration (Airflow), data ingestion, cloud-based databases/warehousing (Snowflake) and business intelligence tools The ability to operate and monitor large data sets through the data lifecycle, including the tooling and observability required to ensure data quality and control at scale Experience implementing, monitoring, and operating data pipelines that are fast, scalable, reliable, and accurate 
Understanding of modern-day data highways, the associated challenges, and effective controls Passionate about data platforms, data quality and everything data Practical and detail-oriented operations leader Inquisitive leader who will bring new ideas that challenge the status quo Ability to navigate a large, highly matrixed organization Strong presence with clients Bachelor’s Degree in Computer Science, Engineering, Mathematics or Statistics Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. 
This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
Posted 5 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Join us as a Software Engineer This is an opportunity for a driven Software Engineer to take on an exciting new career challenge Day-to-day, you'll build a wide network of stakeholders of varying levels of seniority It’s a chance to hone your existing technical skills and advance your career We're offering this role at associate level What You'll Do In your new role, you’ll engineer and maintain innovative, customer-centric, high-performance, secure and robust solutions. You’ll be working within a feature team and using your extensive experience to engineer software, scripts and tools that are often complex, as well as liaising with other engineers, architects and business analysts across the platform. You’ll Also Be Producing complex and critical software rapidly and of high quality which adds value to the business Working in permanent teams who are responsible for the full life cycle, from initial development, through enhancement and maintenance to replacement or decommissioning Collaborating to optimise our software engineering capability Designing, producing, testing and implementing our working code Working across the life cycle, from requirements analysis and design, through coding to testing, deployment and operations The Skills You'll Need You’ll need at least five years of experience in data sourcing, including real-time data integration, and a certification in AWS cloud. 
You’ll Also Need Experience in AWS Cloud, Airflow, and associated data migration from on-premises to cloud, with knowledge of databases like Snowflake, AWS Data Lake, PostgreSQL, Oracle, MongoDB and AWS DynamoDB Experience in multiple programming languages or Low Code toolsets, Kafka and StreamSets Experience of DevOps, Testing and Agile methodology and associated toolsets A background in solving highly complex, analytical and numerical problems Experience of implementing programming best practice, especially around scalability, automation, virtualisation, optimisation, availability and performance
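The real-time data integration called for above usually reduces to windowed computation over an event stream. A minimal standard-library sketch of one such building block — class and field names are illustrative, not from any specific stack:

```python
from collections import deque
from time import time


class SlidingWindowAverage:
    """Maintain a rolling average over the last `window_seconds` of events."""

    def __init__(self, window_seconds=60):
        self.window_seconds = window_seconds
        self.events = deque()  # (timestamp, value) pairs, oldest first

    def add(self, value, timestamp=None):
        ts = timestamp if timestamp is not None else time()
        self.events.append((ts, value))
        self._evict(ts)

    def _evict(self, now):
        # Drop events that have fallen out of the window
        while self.events and self.events[0][0] <= now - self.window_seconds:
            self.events.popleft()

    def average(self):
        if not self.events:
            return None
        return sum(v for _, v in self.events) / len(self.events)
```

In a production stream processor (Kafka Streams, Spark Structured Streaming, and the like) the same eviction logic is handled by the framework's window operators.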
Posted 5 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Join us as a Data Engineering Lead This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital first customer experiences You’ll be simplifying the bank through developing innovative data driven solutions, aspiring to be commercially successful through insight, and keeping our customers’ and the bank’s data safe and secure Participating actively in the data engineering community, you’ll deliver opportunities to support our strategic direction while building your network across the bank We’re recruiting for multiple roles across a range of levels, up to and including experienced managers What you'll do We’ll look to you to demonstrate technical and people leadership to drive value for the customer through modelling, sourcing and data transformation. You’ll be working closely with core technology and architecture teams to deliver strategic data solutions, while driving Agile and DevOps adoption in the delivery of data engineering, leading a team of data engineers. 
We’ll Also Expect You To Be Working with Data Scientists and Analytics Labs to translate analytical model code to well-tested, production-ready code Helping to define common coding standards and model monitoring performance best practices Owning and delivering the automation of data engineering pipelines through the removal of manual stages Developing comprehensive knowledge of the bank’s data structures and metrics, advocating change where needed for product development Educating and embedding new data techniques into the business through role modelling, training and experiment design oversight Leading and delivering data engineering strategies to build a scalable data architecture and a feature-rich customer dataset for data scientists Leading and developing solutions for streaming data ingestion and transformations in line with streaming strategy The skills you'll need To be successful in this role, you’ll need to be an expert level programmer and data engineer with a qualification in Computer Science or Software Engineering. You’ll also need a strong understanding of data usage and dependencies with wider teams and the end customer, as well as extensive experience in extracting value and features from large scale data. 
We'll also expect you to have knowledge of big data platforms like Snowflake, AWS Redshift, Postgres, MongoDB, Neo4j and Hadoop, along with good knowledge of cloud technologies such as Amazon Web Services, Google Cloud Platform and Microsoft Azure You’ll Also Demonstrate Knowledge of core computer science concepts such as common data structures and algorithms, profiling or optimisation An understanding of machine learning, information retrieval or recommendation systems Good working knowledge of CI/CD tools Knowledge of programming languages in data engineering such as Python or PySpark, SQL, Java, and Scala An understanding of Apache Spark and ETL tools like Informatica PowerCenter, Informatica BDM or DEI, StreamSets and Apache Airflow Knowledge of messaging, event or streaming technology such as Apache Kafka Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling and data wrangling Extensive experience using RDBMS, ETL pipelines, Python, Hadoop and SQL
Posted 5 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Join us as a Machine Learning Engineer In this role, you’ll be driving and embedding the deployment, automation, maintenance and monitoring of machine learning models and algorithms Day-to-day, you’ll make sure that models and algorithms work effectively in a production environment while promoting data literacy education with business stakeholders If you see opportunities where others see challenges, you’ll find that this solutions-driven role will be your chance to solve new problems and enjoy excellent career development What you’ll do Your daily responsibilities will include you collaborating with colleagues to design and develop advanced machine learning products which power our group for our customers. You’ll also codify and automate complex machine learning model productions, including pipeline optimisation. We’ll expect you to transform advanced data science prototypes and apply machine learning algorithms and tools. You’ll also plan, manage, and deliver larger or complex projects, involving a variety of colleagues and teams across our business. 
You’ll Also Be Responsible For Understanding the complex requirements and needs of business stakeholders, developing good relationships, and understanding how machine learning solutions can support our business strategy Working with colleagues to productionise machine learning models, including pipeline design and development and testing and deployment, so the original intent is carried over to production Creating frameworks to ensure robust monitoring of machine learning models within a production environment, making sure they deliver quality and performance Understanding and addressing any shortfalls, for instance, through retraining Leading direct reports and wider teams in an Agile way within multi-disciplinary data and analytics teams to achieve agreed project and Scrum outcomes The skills you’ll need To be successful in this role, you’ll need to have a good academic background in a STEM discipline, such as Mathematics, Physics, Engineering or Computer Science. You’ll also have the ability to use data to solve business problems, from hypotheses through to resolution. We’ll look to you to have at least twelve years of experience with machine learning on large datasets, as well as experience building, testing, supporting, and deploying advanced machine learning models into a production environment using modern CI/CD tools, including git, TeamCity and CodeDeploy. You’ll Also Need A good understanding of machine learning approaches and algorithms such as supervised or unsupervised learning, deep learning, NLP, with a strong focus on model development, deployment, and optimization Experience using Python with libraries such as NumPy, Pandas, Scikit-learn, and TensorFlow or PyTorch An understanding of PySpark for distributed data processing and manipulation with AWS (Amazon Web Services) including EC2, S3, Lambda, SageMaker, and other cloud tools. 
Experience with data processing frameworks such as Apache Kafka and Apache Airflow, containerization technologies such as Docker, and orchestration tools such as Kubernetes Experience of building GenAI solutions to automate workflows to improve productivity and efficiency
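The model-building work this role describes is done at scale with Scikit-learn or PyTorch, but the core training loop can be sketched in plain Python. A toy batch-gradient-descent fit of a linear model — illustrative only, not the team's actual stack:

```python
def train_linear(xs, ys, lr=0.01, epochs=500):
    """Fit y = w*x + b by batch gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE = (1/n) * sum((w*x + b - y)^2) w.r.t. w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

Libraries like Scikit-learn wrap this loop (or a closed-form solution) behind `fit`/`predict`, and frameworks like PyTorch compute the gradients automatically.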
Posted 5 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Join us as a Software Engineer This is an opportunity for a driven Software Engineer to take on an exciting new career challenge Day-to-day, you'll be engineering and maintaining innovative, customer-centric, high-performance, secure and robust solutions It’s a chance to hone your existing technical skills and advance your career while building a wide network of stakeholders We're offering this role at associate level What you'll do In your new role, you’ll be working within a feature team to engineer software, scripts and tools, as well as liaising with other engineers, architects and business analysts across the platform. You’ll Also Be Producing complex and critical software rapidly and of high quality which adds value to the business Working in permanent teams who are responsible for the full life cycle, from initial development, through enhancement and maintenance to replacement or decommissioning Collaborating to optimise our software engineering capability Designing, producing, testing and implementing our working software solutions Working across the life cycle, from requirements analysis and design, through coding to testing, deployment and operations The skills you'll need To take on this role, you’ll need at least four years of experience in software engineering, software design, and architecture, and an understanding of how your area of expertise supports our customers. You’ll Also Need Experience of working with development and testing tools, bug tracking tools and wikis Experience in AWS-native services, particularly S3, Glue, Lambda, IAM, and Elastic MapReduce Strong proficiency in Terraform for AWS cloud, Python for developing AWS Lambdas, Airflow DAGs and shell scripting Experience with Apache Airflow for workflow orchestration Experience of DevOps and Agile methodology and associated toolsets
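The "Python for developing AWS Lambdas" requirement above follows AWS's standard handler convention. A minimal sketch that parses a hypothetical S3 event payload (in a real deployment the handler would go on to read the object with boto3):

```python
import json
import urllib.parse


def lambda_handler(event, context):
    """Entry point AWS Lambda invokes: extract bucket/key pairs from an S3 event."""
    processed = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        bucket = s3["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications
        key = urllib.parse.unquote_plus(s3["object"]["key"])
        processed.append(f"s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}
```

The bucket and key names here are made up for illustration; the event shape follows the documented S3 notification format.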
Posted 5 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Join us as a Data Engineering Lead This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital first customer experiences You’ll be simplifying the bank through developing innovative data driven solutions, aspiring to be commercially successful through insight, and keeping our customers’ and the bank’s data safe and secure Participating actively in the data engineering community, you’ll deliver opportunities to support our strategic direction while building your network across the bank We’re recruiting for multiple roles across a range of levels, up to and including experienced managers What you'll do We’ll look to you to demonstrate technical and people leadership to drive value for the customer through modelling, sourcing and data transformation. You’ll be working closely with core technology and architecture teams to deliver strategic data solutions, while driving Agile and DevOps adoption in the delivery of data engineering, leading a team of data engineers. 
We’ll Also Expect You To Be Working with Data Scientists and Analytics Labs to translate analytical model code to well-tested, production-ready code Helping to define common coding standards and model monitoring performance best practices Owning and delivering the automation of data engineering pipelines through the removal of manual stages Developing comprehensive knowledge of the bank’s data structures and metrics, advocating change where needed for product development Educating and embedding new data techniques into the business through role modelling, training and experiment design oversight Leading and delivering data engineering strategies to build a scalable data architecture and a feature-rich customer dataset for data scientists Leading and developing solutions for streaming data ingestion and transformations in line with streaming strategy The skills you'll need To be successful in this role, you’ll need to be an expert level programmer and data engineer with a qualification in Computer Science or Software Engineering. You’ll also need a strong understanding of data usage and dependencies with wider teams and the end customer, as well as extensive experience in extracting value and features from large scale data. 
We'll also expect you to have knowledge of big data platforms like Snowflake, AWS Redshift, Postgres, MongoDB, Neo4j and Hadoop, along with good knowledge of cloud technologies such as Amazon Web Services, Google Cloud Platform and Microsoft Azure You’ll Also Demonstrate Knowledge of core computer science concepts such as common data structures and algorithms, profiling or optimisation An understanding of machine learning, information retrieval or recommendation systems Good working knowledge of CI/CD tools Knowledge of programming languages in data engineering such as Python or PySpark, SQL, Java, and Scala An understanding of Apache Spark and ETL tools like Informatica PowerCenter, Informatica BDM or DEI, StreamSets and Apache Airflow Knowledge of messaging, event or streaming technology such as Apache Kafka Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling and data wrangling Extensive experience using RDBMS, ETL pipelines, Python, Hadoop and SQL
Posted 5 days ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Remote
Job Description Job Title: Offshore Data Engineer Base Location: Bangalore Work Mode: Remote Experience: 5+ Years Job Description: We are looking for a skilled Offshore Data Engineer with strong experience in Python, SQL, and Apache Beam. Familiarity with Java is a plus. The ideal candidate should be self-driven, collaborative, and able to work in a fast-paced environment. Key Responsibilities: Design and implement reusable, scalable ETL frameworks using Apache Beam and GCP Dataflow. Develop robust data ingestion and transformation pipelines using Python and SQL. Integrate Kafka for real-time data streams alongside batch workloads. Optimize pipeline performance and manage costs within GCP services. Work closely with data analysts, data architects, and product teams to gather and understand data requirements. Manage and monitor BigQuery datasets, tables, and partitioning strategies. Implement error handling, resiliency, and observability mechanisms across pipeline components. Collaborate with DevOps teams to enable automated delivery (CI/CD) for data pipeline components. Required Skills: 5+ years of hands-on experience in Data Engineering or Software Engineering. Proficiency in Python and SQL. Good understanding of Java (for reading or modifying codebases). Experience building ETL pipelines with Apache Beam and Google Cloud Dataflow. Hands-on experience with Apache Kafka for stream processing. Solid understanding of BigQuery and data modeling on GCP. Experience with GCP services (Cloud Storage, Pub/Sub, Cloud Composer, etc.). Good to Have: Experience building reusable ETL libraries or framework components. Knowledge of data governance, data quality checks, and pipeline observability. Familiarity with Apache Airflow or Cloud Composer for orchestration. Exposure to CI/CD practices in a cloud-native environment (Docker, Terraform, etc.). 
Tech stack: Python, SQL, Java, GCP (BigQuery, Pub/Sub, Cloud Storage, Cloud Composer, Dataflow), Apache Beam, Apache Kafka, Apache Airflow, CI/CD (Docker, Terraform)
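Apache Beam pipelines like those described above are built by chaining transforms (read, parse, filter, write). The same shape can be sketched with plain generators to show the flow; this is a standard-library stand-in for illustration, not the Beam API itself, and the CSV schema is hypothetical:

```python
def read(records):
    # Stand-in for a Beam source such as ReadFromText
    yield from records


def parse(rows):
    # Split "user,amount" CSV lines into typed tuples
    for row in rows:
        user, amount = row.split(",")
        yield user, float(amount)


def filter_large(pairs, threshold=100.0):
    # Keep only transactions at or above the threshold
    for user, amount in pairs:
        if amount >= threshold:
            yield user, amount


def run(records):
    # Analogous to: p | Read | Parse | FilterLarge in Beam's pipe syntax
    return list(filter_large(parse(read(records))))
```

In real Beam code each step would be a `PTransform` (e.g. `beam.Map`, `beam.Filter`) and the runner (DirectRunner locally, Dataflow in GCP) would handle distribution and batching.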
Posted 5 days ago
7.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description Job Title: Automation Tester - Selenium, Python, Databricks Candidate Specification: 7+ years, immediate to 30 days. Job Description Experience with automated testing. Ability to code and read a programming language (Python). Experience in pytest and Selenium (Python). Experience working with large datasets and complex data environments. Experience with Airflow, Databricks, Data Lake, PySpark. Knowledge and working experience in Agile methodologies. Experience in CI/CD/CT methodology. Experience in test methodologies. Skills Required Role: Automation Tester Industry Type: IT/Computers - Software Functional Area: Required Education: B Tech Employment Type: Full Time, Permanent Key Skills: Selenium, Python, Databricks Other Information Job Code: GO/JC/100/2025 Recruiter Name: Sheena Rakesh
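pytest-style automated tests of the kind this role describes are plain functions whose names start with `test_` and whose bodies are bare `assert`s. A minimal sketch over a hypothetical data-validation helper (Selenium tests follow the same pattern, with a browser driver fixture in place of the dict):

```python
def validate_record(record):
    """Return a list of validation errors for one data record."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    if record.get("amount", 0) < 0:
        errors.append("negative amount")
    return errors


def test_valid_record_passes():
    assert validate_record({"id": "r1", "amount": 10}) == []


def test_bad_record_reports_all_errors():
    assert validate_record({"amount": -5}) == ["missing id", "negative amount"]
```

Running `pytest` against a file containing these functions discovers and executes both tests automatically.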
Posted 5 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We're looking for a dynamic Senior Data Scientist to join our team, working alongside our founders, financial analysts, product managers, and engineers. This is your chance to get hands-on with massive datasets, crafting the metrics that matter to institutional investors worldwide. For those who are curious and passionate about financial technologies, this is a great opportunity to work at an extremely well-capitalized startup with a proven team of senior financial and tech industry talent to build the future of investing tools. Responsibilities: Work with large and complex data sets and derive investment-relevant metrics in close partnership with financial analysts and engineers. Apply knowledge of statistics, programming, data modeling, simulation, and advanced mathematics to recognize patterns, identify opportunities, pose business questions, and make valuable discoveries leading to the development of fundamental metrics needed to evaluate various assets. Implement the risk factor model using advanced statistical and programming techniques, ensuring its performance aligns with the proposed design. Conduct rigorous testing and validation of the model to ensure its effectiveness in identifying, quantifying, and managing various risk factors. Build customer-facing metrics and dashboards. Work closely with analysts, engineers, and Product Managers and provide feedback as we develop our data analytics and research platform. Requirements: Bachelor's degree in Mathematics, Statistics, a relevant technical field, or equivalent practical experience, or a degree in an analytical field (e.g., Computer Science, Engineering, Mathematics, Statistics, Operations Research, Management Science); IIT preferred. 3+ years of experience with data analysis and metrics development. 3+ years of experience analyzing and interpreting data, drawing conclusions, defining recommended actions, and reporting results across stakeholders. 2+ years of experience writing SQL queries. 
2+ years of experience scripting in Python. Interested in learning new technologies to solve customer needs with lots of creative freedom. Strong communication skills and business acumen. Self-starter, motivated by an interest in developing the best possible solutions to problems. Strong SQL and Python programming skills. Experience with Airflow, Google BigQuery, and Clickhouse is a plus.
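The investment-relevant metrics this posting describes are typically simple statistics over return series. A standard-library sketch of one common example, annualized volatility (the 252 trading-day convention is an assumption, and the figures are illustrative):

```python
import statistics


def annualized_volatility(daily_returns, trading_days=252):
    """Sample standard deviation of daily returns, scaled to an annual figure.

    Uses the common sqrt-of-time scaling: annual vol = daily vol * sqrt(252).
    """
    if len(daily_returns) < 2:
        raise ValueError("need at least two observations")
    return statistics.stdev(daily_returns) * trading_days ** 0.5
```

In practice the same computation runs over columns pulled from BigQuery or Clickhouse with SQL, with Python handling the metric definitions.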
Posted 5 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Job Title: Senior Full Stack Developer – Python + React JS - Chennai Job Overview Candidates with 5+ years of experience in Python development and 3+ years of experience in React/Next.js preferred. Key Responsibilities Proven delivery of production APIs using FastAPI (or Flask/Django REST Framework) Deep SQL skills and hands-on experience with SQLAlchemy (or a comparable ORM) and Microsoft SQL Server (or similar) Comfortable with async programming, task queues (Celery/RQ) or workflow engines (Prefect/Airflow) GraphQL experience: query design, schema stitching, client integration, etc. Strong testing culture: pytest, tox, coverage, mocking, etc. CI/CD experience: GitHub Actions / Azure DevOps, Docker build & registry, automated quality gates Skills Should be experienced with Python (FastAPI or Flask), React JS, Next JS, GraphQL, and Azure DevOps. Skills Required Role: Senior Full Stack Developer – Python + React JS - Chennai Industry Type: IT/Computers - Software Functional Area: Required Education: B Tech Employment Type: Full Time, Permanent Key Skills: Azure DevOps, Python, React JS, SQL Other Information Job Code: GO/JC/156/2025 Recruiter Name: Brindha Kamaraj
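The async programming this role asks about rests on Python's asyncio primitives, which FastAPI and task queues build upon. A minimal standard-library sketch of fanning out concurrent jobs (the function and field names are hypothetical):

```python
import asyncio


async def fetch_report(report_id):
    # Stand-in for an awaitable I/O call (a DB query, an HTTP request, ...)
    await asyncio.sleep(0)
    return {"id": report_id, "status": "done"}


async def run_batch(report_ids):
    # Fan out all jobs concurrently; gather preserves input order
    tasks = [fetch_report(r) for r in report_ids]
    return await asyncio.gather(*tasks)


def main(report_ids):
    return asyncio.run(run_batch(report_ids))
```

Inside a FastAPI endpoint the same `await`/`gather` pattern runs on the framework's event loop, so `asyncio.run` is not needed there.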
Posted 5 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
What You'll Do We are looking for an experienced Senior Data Engineer to join our Data Operations team. The ideal candidate will have expertise in Python, Snowflake, SQL, modern ETL tools, and business intelligence platforms such as Power BI. You will need experience integrating SaaS applications such as Salesforce, Zuora, and NetSuite using REST APIs. You will build and maintain data pipelines, develop data models, and ensure seamless data integrations that support business analytics and reporting. The role requires flexibility to collaborate in US time zones as needed. You will report to the Manager, Finance Applications. What Your Responsibilities Will Be Design, develop, and maintain scalable data pipelines and workflows using modern ETL tools and Python. Build and optimize SQL queries and data models on Snowflake to support analytics and reporting needs. Integrate with SaaS platforms such as Salesforce, Zuora, and NetSuite using APIs or native connectors. Develop and support dashboards and reports using Power BI and other reporting tools. Work with data analysts, business users, and other engineering teams to gather requirements and deliver high-quality solutions. Ensure data quality, accuracy, and consistency across systems and datasets. Write clean, well-documented, and testable code with a focus on performance and reliability. Participate in peer code reviews and contribute to best practices in data engineering. Be available for meetings and collaboration in US time zones as required. What You’ll Need To Be Successful You have 5+ years' experience in the data engineering field, with deep SQL knowledge. Experience in Snowflake - SQL, Python, AWS services, Power BI, and ETL tools (DBT, Airflow) is a must. Proficiency in Python for data transformation and scripting. Proficiency in writing complex SQL queries and stored procedures. Experience in data warehouse, data modeling and ETL design concepts. 
Have integrated SaaS systems like Salesforce, Zuora, NetSuite along with relational databases, REST APIs, FTP/SFTP, etc. Knowledge of AWS technologies (EC2, S3, RDS, Redshift, etc.) Excellent communication skills, with the ability to translate technical issues for non-technical stakeholders. Flexibility to work during US business hours as required for team meetings and collaboration. How We’ll Take Care Of You Total Rewards In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses. Health & Wellness Benefits vary by location but generally include private medical, life, and disability insurance. Inclusive culture and diversity Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture. We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship. What You Need To Know About Avalara We’re Avalara. We’re defining the relationship between tax and tech. We’ve already built an industry-leading cloud compliance platform, processing nearly 40 billion customer API calls and over 5 million tax returns a year, and this year we became a billion-dollar business. Our growth is real, and we’re not slowing down until we’ve achieved our mission - to be part of every transaction in the world. We’re bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we’ve designed, that empowers our people to win. Ownership and achievement go hand in hand here. We instill passion in our people through the trust we place in them. We’ve been different from day one. Join us, and your career will be too. We’re An Equal Opportunity Employer Supporting diversity and inclusion is a cornerstone of our company — we don’t want people to fit into our culture, but to enrich it. 
All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national origin, disability, sexual orientation, US Veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.
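The SaaS integration work described earlier in this posting usually includes reconciling records pulled from different systems. A small sketch joining CRM and billing extracts on a shared key (the field names are hypothetical, not Salesforce or Zuora schemas):

```python
def merge_accounts(crm_rows, billing_rows):
    """Join CRM and billing records on account_id, flagging accounts with no billing match."""
    billing_by_id = {r["account_id"]: r for r in billing_rows}
    merged = []
    for crm in crm_rows:
        billing = billing_by_id.get(crm["account_id"])
        merged.append({
            "account_id": crm["account_id"],
            "name": crm["name"],
            "mrr": billing["mrr"] if billing else None,  # monthly recurring revenue
            "in_billing": billing is not None,
        })
    return merged
```

At warehouse scale the same join would be a SQL `LEFT JOIN` in Snowflake, with Python reserved for the extraction and load steps around it.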
Posted 5 days ago
14.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At Cotality, we are driven by a single mission—to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society. Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry. Job Description In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's Product Development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights. 
The Manager of Data Engineering is a leader with a passion for leveraging technical solutions to address business challenges. This role requires deep expertise in agile development practices and a commitment to extreme ownership of software products and platforms. The ideal candidate will collaborate closely with Product Managers and Engineering teams to drive both transformational and operational initiatives. Key Responsibilities Would Include Motivate a high-performing team to achieve outstanding results while fostering individual growth and development. Proven track record of leading teams that have successfully built and launched products in highly scalable growth markets. Experience in building teams and leading organizational change within a fast-paced and highly regarded technology company. Partners with business product leadership to develop strategies and technology roadmaps. Positive client engagement at senior levels, managing and navigating relationships with peers, business line leaders, and engagement across the Protect matrix of services.
IT solution development that aligns with the company's strategic architecture. Manage vendor solutions and deliverables. Responsible for the delivery of complex Data Engineering projects, including data marts, data lakes, data interchanges, and data warehouses. Responsible for managing the costs, optimizations, and overall cost strategies for cloud computing in your domain. Experience & Job Qualifications: Total experience must be in the range of 14-16 years. 10+ years of progressive experience in Data Engineering. Must have 5+ years of experience in MSSQL, Dataflow, GCP, and Snowflake. 5-8 years of experience leading technology teams. Proven success in delivering complex, high-impact projects with measurable business outcomes. Minimum 3 years of experience supporting cloud-based data technologies. Education A BS in Computer Science or an equivalent degree is highly preferred. Technical Expertise Experience developing or leading development teams using the following technologies (or similar): Google Dataflow, Google Airflow, Microsoft SQL Server Integration Services (SSIS), PostgreSQL, Google BigQuery, Google Cloud Platform, REST Services, Data Visualization tools (e.g. PowerBI, SSRS) Industry Experience: Background in the Property and Casualty Insurance industry is preferred. Cotality's Diversity Commitment Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone’s unique contributions, experiences and values. We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences.
Equal Opportunity Employer Statement Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace. Please apply on our website for consideration. Privacy Policy Global Applicant Privacy Policy By providing your telephone number, you agree to receive automated (SMS) text messages at that number from Cotality regarding all matters related to your application and, if you are hired, your employment and company business. Message & data rates may apply. You can opt out at any time by responding STOP or UNSUBSCRIBING and will automatically be opted out company-wide.
Posted 5 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Location - Bangalore (Whitefield) Work Mode - WFO during probation (3 months), later Hybrid Role overview Experience 1. Ten-plus years’ experience in software development, with at least the last few years in a leadership role with progressively increasing responsibilities 2. Extensive experience in the following areas a. C#, .NET b. Designing and building cloud-native solutions (Azure, AWS, Google Cloud Platform) c. Infrastructure as Code tools (Terraform, Pulumi, cloud-specific IaC tools) d. Configuration management tools (Ansible, Chef, SaltStack) e. Containerization and orchestration technologies (Docker, Kubernetes) f. Native and third-party Databricks integrations (Delta Live Tables, Auto Loader, Databricks Workflows / Apache Airflow, Unity Catalog) 3. Extensive experience in Azure 4. Experience designing and implementing data security and governance platforms adhering to compliance standards (HIPAA, SOC 2) preferred
Posted 5 days ago
3.0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
Location: Remote/Hybrid (India-based preferred) Type: Full-Time Must Haves (Don’t Apply If You Miss Any) 3+ years experience in Data Engineering Proven hands-on with ETL pipelines (end-to-end ownership) AWS Resources: Deep experience with EC2, Athena, Lambda, Step Functions (non-negotiable; critical to the role) Strong with MySQL (not negotiable) Docker (setup, deployment, troubleshooting) Good To Have (Adds Major Value) Airflow or any modern orchestration tool PySpark experience Python Ecosystem: SQLAlchemy, DuckDB, PyArrow, Pandas, NumPy, DLT (Data Load Tool). About You You’re a builder, not just a maintainer. You can work independently but communicate crisply. You thrive in fast-moving, startup environments. You care about ownership and impact, not just code. Include the code word Red Panda in your application message, so that we know you have read this section. What You’ll Do Architect, build, and optimize robust data pipelines and workflows Own AWS resource configuration, optimization, and troubleshooting Collaborate with product and engineering teams to deliver business impact fast Automate and scale data processes—no manual work culture Shape the data foundation for real business decisions Cut to the chase. Only serious, relevant applicants will be considered.
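A role like the one above owns ETL steps end to end. The sketch below is a hypothetical, stdlib-only Lambda-style handler (the event shape, field names, and filtering rule are invented for illustration; a real pipeline would read the CSV from S3 and hand results to the next Step Functions state):

```python
import csv
import io
import json


def handler(event, context=None):
    """Hypothetical Lambda-style ETL step: parse a CSV payload,
    drop rows with missing or non-positive amounts, and emit JSON
    for the next pipeline stage."""
    # In a real pipeline the CSV body would be fetched from S3;
    # here it arrives inline in the event for a self-contained demo.
    rows = csv.DictReader(io.StringIO(event["body"]))
    cleaned = [
        {"id": r["id"], "amount": float(r["amount"])}
        for r in rows
        if r["amount"] and float(r["amount"]) > 0
    ]
    return {"statusCode": 200, "body": json.dumps(cleaned)}


if __name__ == "__main__":
    sample = {"body": "id,amount\n1,10.5\n2,-3\n3,7\n"}
    out = handler(sample)
    print(out["statusCode"], json.loads(out["body"]))
```

The same function body could be deployed unchanged as a Lambda handler, since it only takes an event dict and returns a JSON-serializable response.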
Posted 5 days ago
7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Demonstrate a deep understanding of cloud-native, distributed, microservice-based architectures Deliver solutions for complex business problems through a standard software SDLC Build strong relationships with both internal and external stakeholders including product, business and sales partners Demonstrate excellent communication skills with the ability to both simplify complex problems and also dive deeper if needed Build and manage strong technical teams that deliver complex software solutions that scale Manage teams with cross-functional skills that include software, quality, reliability engineers, project managers and scrum masters Provide deep troubleshooting skills with the ability to lead and solve production and customer issues under pressure Leverage strong experience in full stack software development and public cloud like GCP and AWS Mentor, coach and develop junior and senior software, quality and reliability engineers Lead with a data/metrics-driven mindset with a maniacal focus towards optimizing and creating efficient solutions Ensure compliance with EFX secure software development guidelines and best practices and responsible for meeting and maintaining QE, DevSec, and FinOps KPIs Define, maintain and report SLAs, SLOs and SLIs meeting EFX engineering standards in partnership with the product, engineering and architecture teams Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices Drive up-to-date technical documentation including
support, end user documentation and run books Lead Sprint planning, Sprint Retrospectives, and other team activity Responsible for implementation architecture decision making associated with Product features/stories, refactoring work, and EOSL decisions Create and deliver technical presentations to internal and external technical and non-technical stakeholders communicating with clarity and precision, and present complex information in a concise format that is audience appropriate What Experience You Need Bachelor's degree or equivalent experience 7+ years of software engineering experience 7+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 7+ years experience with Cloud technology: GCP, AWS, or Azure 7+ years experience designing and developing cloud-native solutions 7+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 7+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Strong communication and presentation skills Strong leadership qualities Demonstrated problem solving skills and the ability to resolve conflicts Experience creating and maintaining product and software roadmaps Experience overseeing yearly as well as product/project budgets Working in a highly regulated environment Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. 
Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and GitHub) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
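The SLA/SLO/SLI responsibilities in the listing above reduce to simple error-budget arithmetic. The sketch below shows the idea; the 99.9% target and 30-day window are illustrative assumptions, not figures from the posting:

```python
def error_budget_minutes(slo_target: float, window_days: int = 30) -> float:
    """Allowed downtime (in minutes) for a given availability SLO
    over a rolling window."""
    total_minutes = window_days * 24 * 60
    return total_minutes * (1.0 - slo_target)


def budget_remaining(slo_target: float, downtime_minutes: float,
                     window_days: int = 30) -> float:
    """Fraction of the error budget still unspent (negative if blown)."""
    budget = error_budget_minutes(slo_target, window_days)
    return 1.0 - downtime_minutes / budget


if __name__ == "__main__":
    # A 99.9% availability SLO over 30 days allows 43.2 minutes of downtime.
    print(round(error_budget_minutes(0.999), 1))    # 43.2
    # After 10.8 minutes of downtime, 75% of the budget remains.
    print(round(budget_remaining(0.999, 10.8), 2))  # 0.75
```

Teams typically alert on the burn rate of this budget rather than on raw downtime, which is why the SLO definition work called out in the listing matters.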
Posted 5 days ago
15.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Your Team Responsibilities MSCI has an immediate opening in one of our fastest-growing product lines. As a Lead Architect within Sustainability and Climate, you are an integral part of a team that works to develop high-quality architecture solutions for various software applications on modern cloud-based technologies. As a core technical contributor, you are responsible for conducting critical architecture solutions across multiple technical areas to support project goals. The systems under your responsibility will be amongst the most mission critical systems of MSCI. They require strong technology expertise and a strong sense of enterprise system design, state-of-the-art scalability and reliability but also innovation. Your ability to make technology decisions in a consistent framework to support the growth of our company and products, lead the various software implementations in close partnerships with global leaders and multiple product organizations and drive the technology innovations will be the key measures of your success in our dynamic and rapidly growing environment. At MSCI, you will be operating in a culture where we value merit and track record. You will own the full life-cycle of the technology services and provide management, technical and people leadership in the design, development, quality assurance and maintenance of our production systems, making sure we continue to scale our great franchise. Your Skills And Experience That Will Help You Excel Prior senior Software Architecture roles Demonstrate proficiency in programming languages such as Python/Java/Scala and knowledge of SQL and NoSQL databases. Drive the development of conceptual, logical, and physical data models aligned with business requirements. Lead the implementation and optimization of data technologies, including Apache Spark. Experience with one of the table formats, such as Delta, Iceberg. Strong hands-on experience in data architecture, database design, and data modeling.
Proven experience as a Data Platform Architect or in a similar role, with expertise in Airflow, Databricks, Snowflake, Collibra, and Dremio. Experience with cloud platforms such as AWS, Azure, or Google Cloud. Ability to dive into details, hands on technologist with strong core computer science fundamentals. Strong preference for financial services experience Proven leadership of large-scale distributed software teams that have delivered great products on deadline Experience in a modern iterative software development methodology Experience with globally distributed teams and business partners Experience in building and maintaining applications that are mission critical for customers M.S. in Computer Science, Management Information Systems or related engineering field 15+ years of software engineering experience Demonstrated consensus builder and collegial peer About MSCI What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola!
MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. 
Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 5 days ago
5.0 - 9.0 years
19 - 23 Lacs
Mumbai
Work from Office
Overview MSCI has an immediate opening in one of our fastest-growing product lines. As a Lead Architect within Sustainability and Climate, you are an integral part of a team that works to develop high-quality architecture solutions for various software applications on modern cloud-based technologies. As a core technical contributor, you are responsible for conducting critical architecture solutions across multiple technical areas to support project goals. The systems under your responsibility will be amongst the most mission critical systems of MSCI. They require strong technology expertise and a strong sense of enterprise system design, state-of-the-art scalability and reliability but also innovation. Your ability to make technology decisions in a consistent framework to support the growth of our company and products, lead the various software implementations in close partnerships with global leaders and multiple product organizations and drive the technology innovations will be the key measures of your success in our dynamic and rapidly growing environment. At MSCI, you will be operating in a culture where we value merit and track record. You will own the full life-cycle of the technology services and provide management, technical and people leadership in the design, development, quality assurance and maintenance of our production systems, making sure we continue to scale our great franchise. Responsibilities Engages technical teams and business stakeholders to discuss and propose technical approaches to meet current and future needs • Defines the technical target state of the product and drives achievement of the strategy • As the Lead Architect you will be responsible for leading the design, development, and maintenance of our data architecture, ensuring scalability, efficiency, and reliability. • Create and maintain comprehensive documentation for the architecture, processes, and best practices including Architecture Decision Records (ADRs).
• Evaluates recommendations and provides feedback on new technologies • Develops secure and high-quality production code, and reviews and debugs code written by others • Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems • Collaborating with a cross functional team to draft, implement and adapt the overall architecture of our products and support infrastructure in conjunction with software development managers, and product management teams • Staying abreast of new technologies and issues in the software-as-a-service industry, including current technologies, platforms, standards and methodologies • Being actively engaged in setting technology standards that impact the company and its offerings • Ensuring the knowledge sharing of engineering best practices across departments; and developing and monitoring technical standards to ensure adherence to them. Qualifications Prior senior Software Architecture roles • Demonstrate proficiency in programming languages such as Python/Java/Scala and knowledge of SQL and NoSQL databases. • Drive the development of conceptual, logical, and physical data models aligned with business requirements. • Lead the implementation and optimization of data technologies, including Apache Spark. • Experience with one of the table formats, such as Delta, Iceberg. • Strong hands-on experience in data architecture, database design, and data modeling. • Proven experience as a Data Platform Architect or in a similar role, with expertise in Airflow, Databricks, Snowflake, Collibra, and Dremio. • Experience with cloud platforms such as AWS, Azure, or Google Cloud. • Ability to dive into details, hands on technologist with strong core computer science fundamentals.
• Strong preference for financial services experience • Proven leadership of large-scale distributed software teams that have delivered great products on deadline • Experience in a modern iterative software development methodology • Experience with globally distributed teams and business partners • Experience in building and maintaining applications that are mission critical for customers • M.S. in Computer Science, Management Information Systems or related engineering field • 15+ years of software engineering experience • Demonstrated consensus builder and collegial peer What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers.
This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. 
Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 5 days ago
5.0 - 10.0 years
7 - 17 Lacs
Hyderabad, Pune, Chennai
Work from Office
Airflow Data Engineer in AWS platform Job Title: Apache Airflow Data Engineer ("ROLE" as per TCS Role Master) • 4-8 years of experience in AWS, Apache Airflow (on the Astronomer platform), Python, PySpark, SQL • Good hands-on knowledge of SQL and the Data Warehousing life cycle is an absolute requirement. • Experience in creating data pipelines and orchestrating them using Apache Airflow • Significant experience with data migrations and development of Operational Data Stores, Enterprise Data Warehouses, Data Lakes and Data Marts. • Good to have: Experience with cloud ETL and ELT in one of the tools like DBT/Glue/EMR or Matillion or any other ELT tool • Excellent communication skills to liaise with Business & IT stakeholders. • Expertise in planning project execution and effort estimation. • Exposure to Agile ways of working. Candidates for this position will be offered employment through TAIC or TCSL as the entity. Keywords: Data warehousing, PySpark, GitHub, AWS data platform, Glue, EMR, Redshift, Databricks, Data Marts, DBT/Glue/EMR or Matillion, data engineering, data modelling, data consumption
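Orchestrating pipelines with Airflow, as the listing above requires, amounts to running tasks in dependency order. A minimal stdlib sketch of that idea using Python's `graphlib` follows; the task names are hypothetical, and a real DAG would declare Airflow operators rather than a plain dict:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract two feeds, transform, then load the mart.
# Each key maps to the set of tasks it depends on, as in an Airflow DAG.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform": {"extract_orders", "extract_customers"},
    "load_mart": {"transform"},
}


def run_order(graph):
    """Return one valid execution order for the task graph."""
    return list(TopologicalSorter(graph).static_order())


if __name__ == "__main__":
    print(run_order(dag))
```

Airflow's scheduler does the same topological resolution, plus retries, backfills, and parallel execution of independent branches (here, the two extracts could run concurrently).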
Posted 6 days ago
3.0 - 8.0 years
15 - 20 Lacs
Chennai, Sholinganallur
Hybrid
Position Description: Bachelor's Degree 2+ Years in GCP Services - BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Redis Memory, Airflow, Cloud Storage 2+ Years in Data Transfer Utilities 2+ Years in Git / any other version control tool 2+ Years in Confluent Kafka 1+ Years of Experience in API Development 2+ Years in Agile Framework 4+ years of strong experience in Python, PySpark development. 4+ years of shell scripting to develop the ad hoc jobs for data importing/exporting. Skills Required: Google Cloud Platform - BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, API
Posted 6 days ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Data Ops Capability Deployment - Analyst is a seasoned professional role. Applies in-depth disciplinary knowledge, contributing to the development of new solutions/frameworks/techniques and the improvement of processes and workflow for Enterprise Data function. Integrates subject matter and industry expertise within a defined area. Requires in-depth understanding of how areas collectively integrate within the sub-function as well as coordinate and contribute to the objectives of the function and overall business. The primary purpose of this role is to perform data analytics and data analysis across different asset classes, and to build data science/Tooling capabilities within the team. This will involve working closely with the wider Enterprise Data team, in particular the front to back leads to deliver business priorities. The following role is within B & I Data Capabilities team within the Enterprise Data. The team manages the Data quality/Metrics/Controls program in addition to a broad remit to implement and embed improved data governance and data management practices throughout the region. The Data quality program is centered on enhancing Citi’s approach to data risk and addressing regulatory commitments in this area. Key Responsibilities: Hands on with data engineering background and have thorough understanding of Distributed Data platforms and Cloud services. Sound understanding of data architecture and data integration with enterprise applications Research and evaluate new data technologies, data mesh architecture and self-service data platforms Work closely with Enterprise Architecture Team on the definition and refinement of overall data strategy Should be able to address performance bottlenecks, design batch orchestrations, and deliver Reporting capabilities. Ability to perform complex data analytics (data cleansing, transformation, joins, aggregation etc.) on large complex datasets. 
Build analytics dashboards & data science capabilities for Enterprise Data platforms. Communicate complicated findings and propose solutions to a variety of stakeholders. Understand business and functional requirements provided by business analysts and convert them into technical design documents. Work closely with cross-functional teams e.g. Business Analysis, Product Assurance, Platforms and Infrastructure, Business Office, Control and Production Support. Prepare handover documents and manage SIT, UAT and Implementation. Demonstrate an in-depth understanding of how the development function integrates within overall business/technology to achieve objectives; requires a good understanding of the banking industry. Performs other duties and functions as assigned. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Skills & Qualifications 10+ years of active development background and experience in Financial Services or Finance IT is required. Experience with Data Quality/Data Tracing/Data Lineage/Metadata Management Tools Hands-on experience with ETL using PySpark on distributed platforms along with data ingestion, Spark optimization, resource utilization, capacity planning & batch orchestration. In-depth understanding of Hive, HDFS, Airflow, job schedulers Strong programming skills in Python with experience in data manipulation and analysis libraries (Pandas, NumPy) Should be able to write complex SQL/Stored Procs Should have worked on DevOps, Jenkins/Lightspeed, Git, CoPilot. Strong knowledge in one or more of the BI visualization tools such as Tableau, PowerBI.
- Proven experience in implementing data lake/data warehouse solutions for enterprise use cases.
- Exposure to analytical tools and AI/ML is desired.

Education: Bachelor's/University degree or master's degree in Information Systems, Business Analysis, or Computer Science.

------------------------------------------------------
Job Family Group: Data Governance
------------------------------------------------------
Job Family: Data Governance Foundation
------------------------------------------------------
Time Type: Full time
------------------------------------------------------

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 6 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role: MLOps Engineer
Location: Chennai - CKC
Mode of Interview: In Person
Date: 7th June 2025 (Saturday)

Keywords / Skillset:
- AWS SageMaker, Azure ML Studio, GCP Vertex AI
- PySpark, Azure Databricks
- MLflow, Kubeflow, Airflow, GitHub Actions, AWS CodePipeline
- Kubernetes, AKS, Terraform, FastAPI

Responsibilities:
- Model deployment, model monitoring, model retraining
- Deployment, inference, monitoring, and retraining pipelines
- Drift detection (data drift, model drift)
- Experiment tracking
- MLOps architecture
- REST API publishing

Job Responsibilities:
- Research and implement MLOps tools, frameworks, and platforms for our Data Science projects.
- Work on a backlog of activities to raise MLOps maturity in the organization.
- Proactively introduce a modern, agile, and automated approach to Data Science.
- Conduct internal training and presentations about MLOps tools’ benefits and usage.

Required Experience and Qualifications:
- Wide experience with Kubernetes.
- Experience in operationalization of Data Science projects (MLOps) using at least one of the popular frameworks or platforms (e.g. Kubeflow, AWS SageMaker, Google AI Platform, Azure Machine Learning, DataRobot, DKube).
- Good understanding of ML and AI concepts; hands-on experience in ML model development.
- Proficiency in Python for both ML and automation tasks.
- Good knowledge of Bash and the Unix command-line toolkit.
- Experience in CI/CD/CT pipeline implementation.
- Experience with cloud platforms (preferably AWS) would be an advantage.
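Drift detection, one of the responsibilities listed above, is commonly implemented with a population stability index (PSI) over binned feature values. A pure-Python sketch, with invented bin counts; the conventional reading is that a PSI above roughly 0.2 suggests significant drift worth a retraining review:

```python
import math

def psi(expected_counts, actual_counts):
    """Population Stability Index between two binned distributions."""
    e_total, a_total = sum(expected_counts), sum(actual_counts)
    score = 0.0
    for e, a in zip(expected_counts, actual_counts):
        # Small floor avoids log(0) when a bin is empty.
        e_pct = max(e / e_total, 1e-6)
        a_pct = max(a / a_total, 1e-6)
        score += (a_pct - e_pct) * math.log(a_pct / e_pct)
    return score

# Training-time vs. live bin counts for one feature (illustrative numbers).
baseline = [100, 200, 400, 200, 100]
live = [80, 150, 350, 280, 140]

score = psi(baseline, live)
print(f"PSI = {score:.4f}")
```

In a monitoring pipeline this check would run per feature on a schedule (e.g. via Airflow), with scores logged to an experiment tracker such as MLflow and a retraining job triggered past the threshold.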
Posted 6 days ago
6.0 - 9.0 years
17 - 20 Lacs
Hyderabad, Kondapur
Hybrid
Mandatory Skills: Airflow, Python, AWS, and big data technologies such as Spark
Posted 6 days ago
4.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
About This Role

Aladdin Data is at the heart of Aladdin, and increasingly the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage. The DOE team is responsible for the data ecosystem within BlackRock. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, notably investors, operations teams and data scientists. We focus on evolving our platform to deliver exponential scale to the firm, powering the future growth of Aladdin.

Data Pipeline Engineers at BlackRock get to experience working at one of the most recognized financial companies in the world while being part of a software development team responsible for next-generation technologies and solutions. Our engineers design and build large-scale data storage, computation and distribution systems. They partner with data and analytics experts to deliver high-quality analytical and derived data to our consumers.

We are looking for data engineers who like to innovate and seek out complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. We are committed to open source, and we regularly give our work back to the community. Engineers looking to work in the areas of orchestration, data modeling, data pipelines, APIs, storage, distribution, distributed computation, consumption and infrastructure are ideal candidates.

Responsibilities

Data Pipeline Engineers are expected to be involved from the inception of projects: understand requirements, architect, develop, deploy, and maintain data pipelines (ETL/ELT). Typically, they work in a multi-disciplinary squad (we follow Agile!), which involves partnering with program and product managers to expand the product offering based on business demands.
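The ETL/ELT pipelines mentioned above decompose into extract, transform, and load stages. As a rough illustration only (plain Python, no orchestrator; every name and record is invented), a pipeline's stages might look like this before being wrapped in a tool such as Airflow or dbt:

```python
def extract():
    # Stand-in for reading from a source system (file, API, database).
    return [
        {"symbol": "AAPL", "price": "189.5"},
        {"symbol": "MSFT", "price": "415.2"},
        {"symbol": "AAPL", "price": "not-a-number"},  # malformed record
    ]

def transform(records):
    # Cast types and drop records that fail validation.
    clean = []
    for r in records:
        try:
            clean.append({"symbol": r["symbol"], "price": float(r["price"])})
        except ValueError:
            continue  # in production this would go to a dead-letter queue
    return clean

def load(records, target):
    # Stand-in for writing to a warehouse table; returns rows written.
    target.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(f"loaded {loaded} rows")  # loaded 2 rows
```

Keeping each stage a pure function of its inputs is what makes pipelines like this easy to test, retry, and schedule once they are handed to an orchestrator.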
Design is an iterative process, whether for UX, services, or infrastructure. Our goal is to drive up user engagement and adoption of the platform while constantly working towards modernizing and improving platform performance and scalability. Deployment and maintenance require close interaction with various teams, which means maintaining a positive and collaborative working relationship with teams within DOE as well as with the wider Aladdin developer community. Production support for applications is usually required for issues that cannot be resolved by the operations team. Creative and inventive problem-solving skills for reduced turnaround times are highly valued. Preparing user documentation to maintain both development and operations continuity is integral to the role.

An ideal candidate would have:
- At least 4+ years’ experience as a data engineer
- Experience in SQL, Sybase, and Linux (a must)
- Experience coding in two of these languages for server-side/data processing: Java, Python, C++
- 2+ years’ experience using a modern data stack (Spark, Snowflake, BigQuery, etc.) on cloud platforms (Azure, GCP, AWS)
- Experience building ETL/ELT pipelines for complex data engineering projects (Airflow, dbt, Great Expectations would be a plus)
- Experience with database modeling and normalization techniques
- Experience with object-oriented design patterns
- Experience with DevOps tools such as Git, Maven, Jenkins, GitLab CI, Azure DevOps
- Experience with Agile development concepts and related tools
- Ability to troubleshoot and fix performance issues across the codebase and database queries
- Excellent written and verbal communication skills
- Ability to operate in a fast-paced environment
- Strong interpersonal skills with a can-do attitude under challenging circumstances
- BA/BS or equivalent practical experience

Skills that would be a plus:
- Perl, ETL tools (Informatica, Talend, dbt, etc.)
- Experience with Snowflake or other cloud data warehousing products
- Exposure to workflow management tools such as Airflow
- Exposure to messaging platforms such as Kafka
- Exposure to NoSQL platforms such as Cassandra and MongoDB
- Building and delivering REST APIs

Our Benefits

To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model

BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock

At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees.
It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
Posted 6 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
🚀 We're Hiring: Senior Software Engineer – GCP | Python | Angular

We’re looking for a highly skilled and passionate Software Engineer to join our fast-paced, product-focused engineering team. In this role, you’ll be involved in end-to-end development — from design and implementation to testing, deployment, and support. If you thrive in a modern cloud-native, CI/CD-driven development environment and enjoy working on impactful features across the stack, we’d love to hear from you!

📍 Location: Chennai
💼 Join a cutting-edge digital team powering innovation for a globally renowned automotive and mobility leader.

🔧 Key Skills Required:
- Languages & Frameworks: Python, Java, JavaScript (Node.js), Angular, RESTful APIs
- Cloud & DevOps: Google Cloud Platform (GCP), Cloud Run, BigQuery, Git, Jenkins, CI/CD
- Data & Infrastructure: Dataflow, Terraform, Airflow
- Testing & Best Practices: Jest, Mocha, TDD, Clean Code, Design Patterns

👤 Experience & Qualifications:
- 5+ years of professional software development experience
- Bachelor’s degree in Computer Science, Engineering, or a related field
- Experience building scalable full-stack solutions in a cloud environment
- Strong understanding of Agile, CI/CD, and DevOps practices

✨ Why Join Us?
- Work on cutting-edge tech including LLM integrations
- Be part of a team that values quality, ownership, and innovation
- Collaborate across product, engineering, and DevOps in a cloud-native setup

📩 Interested? Drop your profile or DM for a quick conversation.
📌 Immediate to 30 days joiners preferred

#Hiring #SoftwareEngineer #FullStackDeveloper #Python #Angular #GCP #CloudRun #BigQuery #DevOps #LLM #CI/CD #ImmediateJoiners #ChennaiJobs #GoogleCloud
Posted 6 days ago
The Airflow job market in India is growing rapidly as more companies adopt data pipelines and workflow automation. Airflow, an open-source platform, is widely used for orchestrating complex computational workflows and data processing pipelines. Job seekers with Airflow expertise can find lucrative opportunities in industries such as technology, e-commerce, finance, and more.
The average salary range for Airflow professionals in India varies by experience level:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum
In the field of Airflow, a typical career path may progress as follows:
- Junior Airflow Developer
- Airflow Developer
- Senior Airflow Developer
- Airflow Tech Lead
In addition to Airflow expertise, professionals in this field are often expected to have or develop skills in:
- Python programming
- ETL concepts
- Database management (SQL)
- Cloud platforms (AWS, GCP)
- Data warehousing
As you explore job opportunities in the Airflow domain in India, remember to present your expertise, skills, and experience confidently during interviews. Prepare well, stay current with the latest developments in Airflow, and demonstrate your problem-solving abilities to stand out in a competitive job market. Good luck!