6.0 - 11.0 years
8 - 12 Lacs
bengaluru
Work from Office
Position: AI/ML Engineer
Experience: 6+ years
Location: Bangalore (WFO only)
Job description:
- Develop and fine-tune natural language processing (NLP) models for customer matching and deduplication
- Build and optimize MLOps pipelines to streamline AI/ML workflows
- Integrate large language models (LLMs) into enterprise systems
- Contribute to technical decision-making for AI tools and frameworks
- Stay current on emerging AI technologies and contribute to proof-of-concept projects
- Data Processing and Management: work with large datasets, preprocess data, and ensure its quality and accuracy for model training
- Model Deployment and Monitoring: deploy machine learning models into production environments and continuously monitor their performance, making adjustments as needed
- Staying Current with Advancements: keep up with the latest developments in AI and ML technologies, including new algorithms, frameworks, and best practices
- Knowledge of Snowflake Cortex LLM
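A note for candidates on the customer matching and deduplication work mentioned above: such problems are often prototyped with simple text-similarity techniques before LLM-based matching is introduced. The sketch below is a minimal illustration using scikit-learn; the customer names, column names, and similarity threshold are illustrative assumptions, not part of the employer's stack.

```python
# Minimal customer-deduplication sketch using TF-IDF character n-grams.
# Column names and the similarity threshold are illustrative assumptions.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

customers = pd.DataFrame({
    "id": [1, 2, 3],
    "name": ["Acme Industries Pvt Ltd", "ACME Industries Private Limited", "Globex Corporation"],
})

# Character n-grams are robust to small spelling and abbreviation differences.
vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
matrix = vectorizer.fit_transform(customers["name"].str.lower())
similarity = cosine_similarity(matrix)

# Flag candidate duplicate pairs above a threshold for manual or model-based review.
THRESHOLD = 0.8
pairs = [
    (customers.loc[i, "id"], customers.loc[j, "id"], round(float(similarity[i, j]), 3))
    for i in range(len(customers))
    for j in range(i + 1, len(customers))
    if similarity[i, j] >= THRESHOLD
]
print(pairs)  # the two Acme variants should pair up; Globex should not
```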
Posted 1 week ago
3.0 - 7.0 years
5 - 9 Lacs
hyderabad
Work from Office
About Us
Loop is India's fastest-growing insurance broker. We help enterprises manage their employee health benefits with transparency, technology, and trust. We're looking for enrolment champions for the smooth management of enrolments and endorsements.
Key Responsibilities
- Coordinate end-to-end onboarding and enrolment of employees under group insurance schemes
- Liaise with corporate HR teams to collect and verify employee and dependent data
- Ensure timely and accurate data entry into the insurer's and internal systems
- Manage mid-term additions, deletions, and endorsement requests
- Conduct awareness sessions or webinars for employees on insurance benefits and claim processes
- Troubleshoot queries and coordinate with insurance partners to resolve issues
- Maintain detailed records of enrolments, timelines, and discrepancies
- Collaborate with the customer success, sales, and tech teams for process improvement
Requirements
- 3+ years of experience in the insurance, HR services, or employee benefits domain
- Excellent communication and client servicing skills
- Proficient in MS Excel and insurance or CRM tools
- Strong organisational skills and ability to meet deadlines
- Experience handling high-volume data processing
- IRDAI license (or willingness to acquire one)
Preferred Skills
- Prior experience in a broker or insurance tech startup
- Exposure to working with enterprise clients or large-scale employee groups
What We Offer
- Competitive salary + incentives
- Opportunity to work with top enterprise clients
- Training and development in the health insurance domain
- Fast-growing team with career growth opportunities
Posted 1 week ago
5.0 - 7.0 years
9 - 13 Lacs
pune, bengaluru
Work from Office
The purpose of this role is to provide technical guidance and suggest improvements in development processes. Develop required software features, achieving timely delivery in compliance with the performance and quality standards of the company.
Job Description:
Experience: 5-7 years
Key Responsibilities
- Lead the design and implementation of complex data solutions with a business-centric approach.
- Guide junior developers and provide technical mentorship.
- Ensure alignment of data architecture with marketing and business strategies.
- Work within Agile development teams, contributing to sprints and ceremonies.
- Design and implement CI/CD pipelines to support automated deployments and testing.
- Apply data engineering best practices to ensure scalable, maintainable codebases.
- Develop robust data pipelines and solutions using Python and SQL.
- Understand and manipulate business data to support marketing and audience targeting efforts.
- Collaborate with cross-functional teams to deliver data solutions that meet business needs.
- Communicate effectively with stakeholders to gather requirements and present solutions.
- Follow best practices for data processing and coding standards.
Skills
- Proficient in Python for data manipulation and automation.
- Strong experience with SQL development (knowledge of MS SQL is a plus).
- Excellent written and oral communication skills.
- Deep understanding of business data, especially as it relates to marketing and audience targeting.
- Experience with Agile methodologies and CI/CD processes.
- Experience with MS SQL.
- Familiarity with SAS.
- Good to have: B2B and AWS knowledge.
- Nice to have: hands-on experience with orchestration and automation tools such as Snowflake Tasks and Streams.
Location: DGS India - Bengaluru - Manyata N1 Block
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 1 week ago
3.0 - 5.0 years
6 - 9 Lacs
pune, bengaluru
Work from Office
The purpose of this role is to develop required software features, achieving timely delivery in compliance with the performance and quality standards of the company.
Job Description:
Responsibilities:
- Design, develop, and maintain robust data pipelines and workflows using Snowflake features including stored procedures, Tasks and Streams, and Snowpark for Python.
- Write optimized and maintainable SQL and Python code for data processing and transformation.
- Collaborate with cross-functional teams to build and support marketing data products.
- Ensure data quality, integrity, and governance across all data products.
- Participate in Agile ceremonies (standups, planning, retrospectives), contributing to sprint planning and delivery.
- Understand data requirements and deliver scalable solutions.
- Monitor and optimize data pipeline performance and cost-efficiency in Snowflake.
- Create technical documentation and maintain Confluence pages for knowledge sharing.
Required Skills & Qualifications:
- 3-5 years of experience in Data Engineering or a related field.
- Strong expertise in Snowflake development: stored procedures, Tasks/Streams, SQL, and Snowpark for Python.
- Proficient in Python programming, especially for data manipulation and automation.
- Hands-on experience with marketing data, attribution models, campaign data, and customer segmentation.
- Familiarity with the Agile development process and tools.
- Excellent written and verbal communication skills.
Good to Have:
- Experience with JIRA and Confluence.
- Exposure to CI/CD practices using GitHub and/or Bitbucket.
- Understanding of orchestration and automation tools.
- Knowledge of modern data architectures and cloud-native solutions.
Location: DGS India - Bengaluru - Manyata H2 Block
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
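For context on the Snowflake stack named above (stored procedures, Tasks/Streams, Snowpark for Python), a typical Snowpark transformation job looks roughly like the sketch below. This is a hedged illustration assuming the snowflake-snowpark-python package; the connection parameters, table names, and columns are placeholders, not details from the posting.

```python
# Illustrative Snowpark for Python job; connection parameters, table names,
# and columns are placeholders rather than details from this posting.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

connection_parameters = {
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "COMPUTE_WH",
    "database": "MARKETING",
    "schema": "RAW",
}

session = Session.builder.configs(connection_parameters).create()

# Read raw campaign events and keep only well-formed impression rows.
events = session.table("CAMPAIGN_EVENTS")
clean = (
    events.filter(col("EVENT_TYPE") == "impression")
          .filter(col("COST") > 0)
          .select(col("CAMPAIGN_ID"), col("EVENT_DATE"), col("COST"))
)

# Persist for downstream marketing data products. In practice a Snowflake Task
# could schedule this job, with a Stream on CAMPAIGN_EVENTS supplying new rows.
clean.write.mode("overwrite").save_as_table("ANALYTICS.IMPRESSION_COSTS")
session.close()
```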
Posted 1 week ago
0.0 - 2.0 years
1 - 3 Lacs
pune
Work from Office
Roles and Responsibilities:
- Manage back-office operations to ensure smooth workflow and efficient support.
- Handle accurate data processing, documentation, and record-keeping.
- Coordinate with internal team members to achieve project deadlines and organizational goals.
- Maintain and update records, reports, and databases using MS Excel and related tools.
- Support management with administrative tasks and provide timely reports for decision-making.
Preferred Candidate Profile:
- Degree in BA/B.Com/BBA or a related field
- Both freshers and experienced candidates are welcome
- Strong problem-solving and analytical skills
- Excellent verbal and written communication (Hindi and English preferred)
- Ability to coordinate with multiple stakeholders
Benefits: Cell phone reimbursement and salary as per industry standards
Posted 1 week ago
2.0 - 3.0 years
6 Lacs
mumbai
Work from Office
MUMBAI (ON-SITE) | 2-3 years experience | FULL TIME
Job Description:
- Design and Optimize Strategies: implement, refine, and optimize trading strategies and ML models based on market data, order flow, and trade data from market quotes. Backtest these strategies using Python, R, and other relevant tools.
- Investment Idea Generation: generate short, medium, and long-term investment ideas and strategies through rigorous quantitative research and data analysis.
- AI and ML in Finance: apply cutting-edge AI and ML techniques to stock selection, portfolio construction, and strategy development.
- Data Processing and Analysis: work with large datasets of equities and derivatives, analyzing market data to extract meaningful insights that can improve model performance and decision-making.
Eligibility criteria:
- Bachelor's or Master's degree in Quantitative Finance, Mathematics, Computer Science, Statistics, Physics, Engineering, or a related quantitative discipline.
- Basic understanding of financial markets, derivatives, equities, and statistical concepts.
- Proficiency in programming languages such as Python and R (other languages like C++ and Java are a plus).
- Excellent communication skills to collaborate with other teams and clients.
- Strong analytical and problem-solving skills.
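To illustrate the kind of strategy backtesting referenced above, here is a toy moving-average crossover backtest in Python with pandas and NumPy. The price series is synthetic and the parameters are arbitrary; it is a sketch of the workflow, not a recommended strategy.

```python
# Toy moving-average crossover backtest; the price series and parameters
# are synthetic and illustrative, not a production strategy.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
prices = pd.Series(
    100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, 500))),
    index=pd.bdate_range("2023-01-02", periods=500),
    name="close",
)

fast = prices.rolling(20).mean()
slow = prices.rolling(50).mean()

# Long when the fast average is above the slow average; shift(1) avoids look-ahead bias.
position = (fast > slow).astype(int).shift(1).fillna(0)

daily_returns = prices.pct_change().fillna(0)
strategy_returns = position * daily_returns

cumulative = (1 + strategy_returns).cumprod()
sharpe = np.sqrt(252) * strategy_returns.mean() / strategy_returns.std()
print(f"Total return: {cumulative.iloc[-1] - 1:.2%}, Sharpe: {sharpe:.2f}")
```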
Posted 1 week ago
8.0 - 15.0 years
4 - 7 Lacs
hyderabad
Work from Office
About the Role
We are seeking a highly skilled and motivated Lead Data Engineer to design, develop, and manage scalable data pipelines and modern data platforms. The ideal candidate will have strong expertise in ETL/ELT, SQL, PySpark, Azure Cloud, and Snowflake, along with proven leadership experience in guiding teams and delivering enterprise-grade data solutions.
Key Responsibilities
- Design, build, and optimize scalable and reliable ETL/ELT pipelines using SQL, PySpark, and cloud-native services.
- Implement data ingestion, transformation, and integration workflows from multiple structured and unstructured data sources.
- Ensure data quality, integrity, and performance across the data lifecycle.
Cloud & Data Platform Management
- Architect and manage data solutions on Azure (ADF, Data Lake, Databricks, Synapse).
- Leverage Snowflake for data modeling, performance tuning, and advanced analytics.
- Implement security, governance, and monitoring best practices.
Leadership & Collaboration
- Lead and mentor a team of data engineers, setting technical direction and enforcing coding/architecture standards.
- Collaborate with data architects, analysts, and business stakeholders to define data requirements and deliver solutions.
- Drive adoption of modern data engineering practices, CI/CD, and automation.
Performance & Optimization
- Optimize large-scale data processing using PySpark and distributed systems.
- Implement best practices for query tuning, cost optimization, and pipeline efficiency.
Technical Expertise
- Strong hands-on experience with ETL/ELT design and development.
- Proficiency in advanced SQL for complex queries, performance tuning, and stored procedures.
- Expertise in PySpark and distributed data processing frameworks.
- Solid experience with Azure services (ADF, ADLS, Databricks, Synapse, Key Vault, etc.).
- Strong working knowledge of Snowflake (data modeling, SnowSQL, task/stream management, query optimization).
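As an illustration of the ETL/ELT pipeline work described above, a minimal PySpark batch job might look like the following. The storage paths (shown as ADLS-style URIs), column names, and cleansing rules are assumptions for the sketch rather than the employer's actual pipeline.

```python
# Sketch of a simple extract-transform-load step in PySpark; paths and
# columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw orders landed in the data lake (e.g. ADLS in an Azure setup).
raw = spark.read.parquet("abfss://raw@datalake.dfs.core.windows.net/orders/")

# Transform: basic cleansing, typing, and quality filtering.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("load_ts", F.current_timestamp())
)

# Load: write partitioned output for downstream consumption (e.g. Synapse or Snowflake).
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("abfss://curated@datalake.dfs.core.windows.net/orders/"))
```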
Posted 1 week ago
3.0 - 7.0 years
4 - 5 Lacs
mumbai, navi mumbai
Work from Office
About Atos
Atos is a global leader in digital transformation with c. 78,000 employees and annual revenue of c. 10 billion. European number one in cybersecurity, cloud and high-performance computing, the Group provides tailored end-to-end solutions for all industries in 68 countries. A pioneer in decarbonization services and products, Atos is committed to a secure and decarbonized digital for its clients. Atos is a SE (Societas Europaea) and listed on Euronext Paris.
The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers and employees, and members of societies at large, to live, work and develop sustainably in a safe and secure information space.
Required Skills and Competencies: MS Fabric
- Design and implement data solutions using Microsoft Fabric components (Power BI, Synapse, Data Engineering, Data Factory, Data Science, and Data Activator).
- Develop robust ETL/ELT pipelines leveraging Data Factory and Synapse Pipelines.
- Build and manage lakehouses, data warehouses, and datasets in OneLake.
- Create insightful and interactive Power BI reports and dashboards to meet business needs.
- Collaborate with data scientists, analysts, and business stakeholders to deliver integrated data solutions.
- Optimize performance of data pipelines and reporting solutions.
- Implement best practices for data governance, security, and compliance in the Microsoft Fabric ecosystem.
- Stay updated on the latest Fabric features and suggest enhancements or adoptions where applicable.
Details
- 10+ years of experience in Azure Data Services, Data Architecture, and Cloud Infrastructure.
- 3+ years of experience in MS Fabric: design and implement data solutions using Microsoft Fabric components (Power BI, Synapse, Data Engineering, Data Factory, Data Science, and Data Activator).
- Expertise in Microsoft Purview, Data Governance, and Security frameworks.
- Experience in performance tuning on Microsoft Fabric & Azure.
- Proficiency in SQL, Python, PySpark, and Power BI for data engineering and analytics.
- Experience in DevOps for Data (CI/CD, Terraform, Bicep, GitHub Actions, ARM Templates).
- Strong problem-solving and troubleshooting skills in Azure/Fabric & Data Services.
- Data Flows: design data flows within the Microsoft Fabric environment.
- Storage Strategies: implement OneLake storage strategies.
- Analytics Configuration: configure Synapse Analytics workspaces.
- Migration: experience in potential migration from existing data platforms such as Databricks/Spark to Microsoft Fabric.
- Integration Patterns: establish Power BI integration patterns.
- Data Integration: architect data integration patterns between systems using Azure Databricks/Spark and Microsoft Fabric.
- Delta Lake Architecture: design Delta Lake architecture and implement medallion architecture (Bronze/Silver/Gold layers).
- Real-Time Data Ingestion: create real-time data ingestion patterns and establish data quality frameworks.
- Data Governance: establish data governance frameworks incorporating Microsoft Purview for data quality, lineage, and compliance.
- Security: implement row-level security, data masking, and audit logging mechanisms.
- Pipeline Development: design and implement scalable data pipelines using Azure Databricks/Spark for ETL/ELT processes and real-time data integration.
- Performance Optimization: implement performance tuning strategies for large-scale data processing and analytics workloads.
- Analytical Skills: strong analytical and problem-solving skills.
- Communication: excellent communication and teamwork skills.
- Certifications: relevant certifications in Microsoft data platforms are a plus.
Our Offering:
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
- Wellbeing programs and work-life balance - integration and passion-sharing events.
- Attractive salary and company initiative benefits.
- Courses and conferences.
- Hybrid work culture.
Here at Atos, diversity and inclusion are embedded in our DNA. Read more about our commitment to a fair work environment for all. Atos is a recognized leader in its industry across Environment, Social and Governance (ESG) criteria. Find out more on our CSR commitment. Choose your future. Choose Atos.
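To make the medallion (Bronze/Silver/Gold) pattern mentioned above concrete, here is a minimal Bronze-to-Gold promotion sketch using Delta Lake on Spark. It assumes a Spark session with Delta Lake configured; the paths, table layout, and columns are illustrative only, not the platform described in the posting.

```python
# Sketch of a Bronze -> Silver -> Gold promotion in a medallion layout using
# Delta Lake on Spark; paths and columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_demo").getOrCreate()

# Bronze: raw events appended as-is from the ingestion layer.
bronze = spark.read.format("delta").load("/lakehouse/bronze/events")

# Silver: deduplicated, typed, and quality-checked records.
silver = (
    bronze.dropDuplicates(["event_id"])
          .filter(F.col("event_ts").isNotNull())
          .withColumn("event_date", F.to_date("event_ts"))
)
silver.write.format("delta").mode("overwrite").save("/lakehouse/silver/events")

# Gold: business-level aggregate ready for Power BI / reporting.
gold = silver.groupBy("event_date", "event_type").count()
gold.write.format("delta").mode("overwrite").save("/lakehouse/gold/event_counts")
```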
Posted 1 week ago
4.0 - 7.0 years
8 - 12 Lacs
bengaluru
Work from Office
About the role
Together with the PSS Operations Lead, you will play a key role in supporting PSS data-driven decision-making processes by developing and maintaining robust data models, automated pipelines, and analytical tools. The role requires both technical fluency and a sharp analytical mindset to convert complex data into meaningful insights for PSS Client Managers.
Key responsibilities:
- Data Processing: implement big data processing workflows and pipelines to handle large-scale datasets efficiently.
- Model Development and Maintenance: design, develop, test, deploy, and retrain learning models and algorithms to solve complex business problems, in collaboration with other Swiss Re teams, to optimize PSS delivery for business growth.
- Automation: promote and implement best practices to develop automated data extraction solutions for pipeline and pricing data. Assess new and promising GenAI technologies within the group.
- Platform Management: maintain and enhance our data platforms to ensure high performance and reliability.
- Collaboration: work closely with business stakeholders to understand their needs, provide insights, and deliver tailored solutions.
- KPI Tracking: support management to deliver on business goals.
About the Team
Swiss Re Corporate Solutions is the commercial insurance arm of the Swiss Re Group. We offer innovative insurance solutions to large and midsized multinational corporations from our approximately 50 locations worldwide. We help clients mitigate their risk exposure, whilst our industry-leading claims service provides them with additional peace of mind.
About You
- Degree in Computer Science, Data Science, Finance, or a related field
- 4+ years of experience as a Business Analyst, Data Engineer, or in a similar role
- Strong proficiency in Excel, including advanced modelling and macros
- Experience in data modelling and API design
- Practical experience in building automated data transformation workflows and pipelines
- Strong technical and quantitative skills; proficient with tools for data analytics and reporting
- Excellent analytical and problem-solving skills with strong attention to detail
- Proven ability to communicate complex data in an accessible and actionable way
- Comfortable working under pressure, handling multiple tasks, and managing diverse stakeholder needs
- Self-starter with the ability to work independently with minimal direction
- Positive, goal-oriented mindset with a "can-do" attitude
- Fluent in English; additional languages are a plus
About Swiss Re
Swiss Re is one of the world's leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. We cover both Property & Casualty and Life & Health. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world.
Our success depends on our ability to build an inclusive culture encouraging fresh perspectives and innovative thinking. We embrace a workplace where everyone has equal opportunities to thrive and develop professionally regardless of their age, gender, race, ethnicity, gender identity and/or expression, sexual orientation, physical or mental ability, skillset, thought or other characteristics. In our inclusive and flexible environment everyone can bring their authentic selves to work and their passion for sustainability. If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.
Reference Code: 135220
Posted 1 week ago
7.0 - 9.0 years
30 - 35 Lacs
chennai
Work from Office
Job Description
We are looking for a highly skilled Lead Data Analyst with strong expertise in Data Warehousing & Analytics to join our team. The ideal candidate will have extensive experience in designing and managing data solutions, advanced SQL proficiency, and hands-on expertise in Python.
Key Responsibilities:
- Design, develop, and maintain scalable data warehouse solutions.
- Write and optimize complex SQL queries for data extraction, transformation, and reporting.
- Develop and automate data pipelines using Python.
- Work with AWS cloud services for data storage, processing, and analytics.
- Collaborate with cross-functional teams to provide data-driven insights and solutions.
- Ensure data integrity, security, and performance optimization.
Qualifications
- 7-9 years of experience in Data Warehousing & Analytics.
- Strong proficiency in writing complex SQL queries, with a deep understanding of query optimization, stored procedures, and indexing.
- Hands-on experience in Python.
Posted 1 week ago
7.0 - 8.0 years
13 - 17 Lacs
noida
Work from Office
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate
Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Responsibilities:
- Lead the development and deployment of AI/ML and generative AI models, managing full project lifecycles from conception to delivery.
- Collaborate with senior stakeholders to identify strategic opportunities for AI applications, aligning them with business objectives.
- Collaborate with and oversee teams of data scientists and engineers, providing guidance and mentorship and ensuring high-quality deliverables.
- Drive research and innovation in AI techniques and tools, fostering an environment of continuous improvement and learning.
- Ensure compliance with data governance standards and ethical AI practices in all implementations.
- Present AI insights and project progress to clients and internal leadership, adapting technical language to suit audience expertise levels.
Mandatory skill sets: AI/ML, NLP, and Generative AI models
Preferred skill sets: Azure, Google Cloud
Years of experience required: 7-8 years
Education qualification: Advanced degree (Master's or Ph.D.) in Computer Science, Artificial Intelligence, Data Science, or a related discipline.
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Doctor of Philosophy, Bachelor Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Azure DevOps
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Strategy {+ 22 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship: No
Government Clearance Required: No
Job Posting End Date:
Posted 1 week ago
3.0 - 5.0 years
9 - 12 Lacs
pune
Work from Office
Job Title: Big Data NiFi Developer
About Us
Capco, a Wipro company, is a global technology and management consulting firm. Capco was awarded Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial and energy sectors. We are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry - projects that will transform the financial services industry.
MAKE AN IMPACT
Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.
#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.
Job Title: Big Data NiFi Developer
Location: Pune (Hybrid)
Experience: 3 to 5 years
Work Mode: Hybrid (2-3 days from the client office, rest remote)
We are seeking a highly skilled and motivated Big Data NiFi Developer to join our growing data engineering team in Pune. The ideal candidate will have hands-on experience with Apache NiFi, a strong understanding of big data technologies, and a background in data warehousing or ETL processes. If you are passionate about working with high-volume data pipelines and building scalable data integration solutions, we'd love to hear from you.
Key Responsibilities:
- Design, develop, and maintain data flow pipelines using Apache NiFi.
- Integrate and process large volumes of data from diverse sources using Spark and NiFi workflows.
- Collaborate with data engineers and analysts to transform business requirements into data solutions.
- Write reusable, testable, and efficient code in Python, Java, or Scala.
- Develop and optimize ETL/ELT pipelines for performance and scalability.
- Ensure data quality, consistency, and integrity across systems.
- Participate in code reviews, unit testing, and documentation.
- Monitor and troubleshoot production data workflows and resolve issues proactively.
Skills & Qualifications:
- 3 to 5 years of hands-on experience in Big Data development.
- Strong experience with Apache NiFi for data ingestion and transformation.
- Proficient in at least one programming language: Python, Scala, or Java.
- Experience with Apache Spark for distributed data processing.
- Solid understanding of data warehousing concepts and ETL tools/processes.
- Experience working with large datasets and batch and streaming data processing.
- Knowledge of the Hadoop ecosystem and cloud platforms (AWS, Azure, or GCP) is a plus.
- Excellent problem-solving and communication skills.
- Ability to work independently in a hybrid work environment.
Nice to Have:
- Experience with NiFi Registry and version control integration.
- Familiarity with containerization tools (Docker/Kubernetes).
- Exposure to real-time data streaming tools like Kafka.
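For illustration of the NiFi-plus-Spark style ingestion described above, a comparable flow in Spark Structured Streaming reads events from Kafka, parses them, and lands them for downstream use. The broker address, topic, schema, and paths below are assumptions, and the job requires the spark-sql-kafka connector package on the classpath.

```python
# Sketch of a streaming ingestion flow similar in spirit to a NiFi + Spark pipeline:
# read events from Kafka, parse them, and land them for downstream use.
# Broker address, topic, schema, and paths are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("nifi_style_ingest").getOrCreate()

schema = StructType([
    StructField("txn_id", StringType()),
    StructField("account", StringType()),
    StructField("amount", DoubleType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "transactions")
       .load())

# Kafka delivers bytes; cast the value to string and parse the JSON payload.
parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
             .select(F.from_json("json", schema).alias("rec"))
             .select("rec.*"))

query = (parsed.writeStream
         .format("parquet")
         .option("path", "/data/landing/transactions")
         .option("checkpointLocation", "/data/checkpoints/transactions")
         .outputMode("append")
         .start())
query.awaitTermination()
```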
Posted 1 week ago
2.0 - 8.0 years
9 - 10 Lacs
kolkata
Work from Office
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager
Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities:
Role: Azure Data Engineer
Experience: 4-8 years
Location: Mumbai
Job Description (Azure Data Engineer):
Roles and Responsibilities:
- Responsible for data management activities related to the migration of on-prem sources to cloud systems using Microsoft Azure PaaS services and Azure cloud technologies, including the creation and use of ingestion pipelines, data lakes, cloud-based data marts and data warehouses, and a cloud-based semantic data services layer.
Desired Candidate Profile:
- 4 to 8 years of professional experience with working knowledge in a Data and Analytics role with a global organization.
- 2+ years of experience in Azure cloud infrastructure for data services.
- Experience in leading development of Data and Analytics products, from the requirement-gathering stage to driving user adoption.
- Hands-on experience of developing, deploying, and running cloud solutions on Azure services like Azure Data Lake Storage, Azure Data Lake Analytics, Azure Data Factory, and Synapse.
- Candidates with strong data transformation experience on ADF and ADB (PySpark/Delta) are preferred.
- Strong proficiency in writing and optimizing SQL queries and working with databases.
- Ability to acquire specialized domain knowledge required to be more effective in all work activities.
- BI and data-warehousing concepts are a must.
- Exposure to Azure Synapse is a plus.
- Microsoft Azure Data Engineer certified candidates are preferred.
Mandatory skill sets: Azure Data Engineer / ADF / Azure Data Lake
Preferred skill sets: Azure Data Engineer / ADF / Azure Data Lake
Years of experience required: 4-8 years
Education qualification: B.E. (B.Tech) / M.E. / M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master Degree, Bachelor Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Data Engineering
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 33 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship: No
Government Clearance Required: No
Job Posting End Date:
Posted 1 week ago
5.0 - 7.0 years
18 - 20 Lacs
pune, bengaluru
Work from Office
The purpose of this role is to provide technical guidance and suggest improvements in development processes. Develop required software features, achieving timely delivery in compliance with the performance and quality standards of the company.
Job Description:
Experience: 5-7 years
Key Responsibilities
- Lead the design and implementation of complex data solutions with a business-centric approach.
- Guide junior developers and provide technical mentorship.
- Ensure alignment of data architecture with marketing and business strategies.
- Work within Agile development teams, contributing to sprints and ceremonies.
- Design and implement CI/CD pipelines to support automated deployments and testing.
- Apply data engineering best practices to ensure scalable, maintainable codebases.
- Develop robust data pipelines and solutions using Python and SQL.
- Understand and manipulate business data to support marketing and audience targeting efforts.
- Collaborate with cross-functional teams to deliver data solutions that meet business needs.
- Communicate effectively with stakeholders to gather requirements and present solutions.
- Follow best practices for data processing and coding standards.
Skills
- Proficient in Python for data manipulation and automation.
- Strong experience with SQL development (knowledge of MS SQL is a plus).
- Excellent written and oral communication skills.
- Deep understanding of business data, especially as it relates to marketing and audience targeting.
- Experience with Agile methodologies and CI/CD processes.
- Experience with MS SQL.
- Familiarity with SAS.
- Good to have: B2B and AWS knowledge.
- Nice to have: hands-on experience with orchestration and automation tools such as Snowflake Tasks and Streams.
Location: DGS India - Bengaluru - Manyata N1 Block
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 1 week ago
3.0 - 7.0 years
5 - 9 Lacs
hyderabad
Work from Office
About Us
Loop is India's fastest-growing insurance broker. We help enterprises manage their employee health benefits with transparency, technology, and trust. We're looking for enrolment champions for the smooth management of enrolments and endorsements.
Key Responsibilities
- Coordinate end-to-end onboarding and enrolment of employees under group insurance schemes
- Liaise with corporate HR teams to collect and verify employee and dependent data
- Ensure timely and accurate data entry into the insurer's and internal systems
- Manage mid-term additions, deletions, and endorsement requests
- Conduct awareness sessions or webinars for employees on insurance benefits and claim processes
- Troubleshoot queries and coordinate with insurance partners to resolve issues
- Maintain detailed records of enrolments, timelines, and discrepancies
- Collaborate with the customer success, sales, and tech teams for process improvement
Requirements
- 3+ years of experience in the insurance, HR services, or employee benefits domain
- Excellent communication and client servicing skills
- Proficient in MS Excel and insurance or CRM tools
- Strong organisational skills and ability to meet deadlines
- Experience handling high-volume data processing
- IRDAI license (or willingness to acquire one)
Preferred Skills
- Prior experience in a broker or insurance tech startup
- Exposure to working with enterprise clients or large-scale employee groups
What We Offer
- Competitive salary + incentives
- Opportunity to work with top enterprise clients
- Training and development in the health insurance domain
- Fast-growing team with career growth opportunities
Posted 1 week ago
10.0 - 15.0 years
9 - 12 Lacs
chennai
Work from Office
We deliver the world's most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role. Building on our past. Ready for the future.
Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we're bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.
The Role
As a Marine & Coastal Engineer with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience.
- Understanding of waves, hydrodynamics, sediment transport, coastal storm surge and shoreline evolution.
- Primary expertise in wave theory, coastal processes, and coastal modelling.
- Ability to contribute to multidisciplinary projects beyond the core coastal focus.
- Proficiency in tools such as the OpenTELEMAC suite, MIKE 21/3 suite, ADCIRC, CGWAVE, BOUSS-2D, Delft3D, and SWAN for simulating waves, currents, storm surge, and sediment dynamics.
- Experience with structures like breakwaters, revetments, groynes, seawalls, and ripraps.
- Familiarity with USACE CEM, EurOtop, CIRIA, and the Rock Manual for design and assessment.
- Ability to process and translate coastal and oceanographic data into actionable insights.
- Skills in tuning models to match observed data and ensuring reliability of simulations.
- Experience in executing and managing consultancy or research projects related to coastal engineering.
- Competence in preparing clear, detailed technical reports and documentation, and in contributing to technical proposals and supporting the pursuit of new projects.
- Ability to present findings and progress to clients, stakeholders, and technical teams.
- Working effectively with environmental scientists, civil engineers, and planners.
- Familiarity with geospatial tools for mapping and analyzing coastal features.
- Knowledge of scripting languages like Python or MATLAB for model automation and data processing.
- Experience in mooring analysis for floating structures and vessels under wave and current loading using Optimoor/MIKE-MA.
About You
To be considered for this role it is envisaged you will possess the following attributes:
- Bachelor's degree (B.Tech) in Civil Engineering and Master's degree (M.Tech) in Ocean Engineering from a recognized institution.
- 10+ years of progressive experience in the civil and ocean engineering domain.
Moving forward together
We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We're building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.
Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice here.
Please note: if you are being represented by a recruitment agency you will not be considered; to be considered you will need to apply directly to Worley.
Posted 1 week ago
3.0 - 5.0 years
15 - 16 Lacs
pune, bengaluru
Work from Office
The purpose of this role is to develop required software features, achieving timely delivery in compliance with the performance and quality standards of the company.
Job Description:
Responsibilities:
- Design, develop, and maintain robust data pipelines and workflows using Snowflake features including stored procedures, Tasks and Streams, and Snowpark for Python.
- Write optimized and maintainable SQL and Python code for data processing and transformation.
- Collaborate with cross-functional teams to build and support marketing data products.
- Ensure data quality, integrity, and governance across all data products.
- Participate in Agile ceremonies (standups, planning, retrospectives), contributing to sprint planning and delivery.
- Understand data requirements and deliver scalable solutions.
- Monitor and optimize data pipeline performance and cost-efficiency in Snowflake.
- Create technical documentation and maintain Confluence pages for knowledge sharing.
Required Skills & Qualifications:
- 3-5 years of experience in Data Engineering or a related field.
- Strong expertise in Snowflake development: stored procedures, Tasks/Streams, SQL, and Snowpark for Python.
- Proficient in Python programming, especially for data manipulation and automation.
- Hands-on experience with marketing data, attribution models, campaign data, and customer segmentation.
- Familiarity with the Agile development process and tools.
- Excellent written and verbal communication skills.
Good to Have:
- Experience with JIRA and Confluence.
- Exposure to CI/CD practices using GitHub and/or Bitbucket.
- Understanding of orchestration and automation tools.
- Knowledge of modern data architectures and cloud-native solutions.
Location: DGS India - Bengaluru - Manyata H2 Block
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 1 week ago
1.0 - 5.0 years
13 - 17 Lacs
gurugram
Work from Office
About the Role
Mastercard's Economics Institute is seeking a talented Economic Analyst with R or Python programming skills to join our Global Growth and Operations team. Reporting to the Director, Growth and Operations, this individual will blend advanced economic research with strong programming and data visualization skills. This is a unique opportunity for someone passionate about applying economic thinking and technical expertise to real-world questions at the intersection of economics, retail, and commerce. The role involves working on large-scale data analytics, developing innovative economic insights, and building compelling visualizations that help communicate these insights to diverse audiences.
- Support client and stakeholder engagements across MEI.
- Collaborate with team economists, econometricians, developers, visualization experts and industry partners.
- Develop and test hypotheses at the intersection of economics, retail, and commerce.
- Structure work and manage small project streams, delivering impactful results.
- Identify creative analyses and develop proprietary diagnostic indices using large and complex datasets, including macroeconomic and big data sources.
- Generate insights, synthesize analyses into impactful storylines and interactive visuals, and help write reports and client presentations.
- Enhance existing products and partner with internal stakeholders to develop new economic solutions.
- Help create thought leadership and intellectual capital.
- Create and format analytical content using Jupyter Notebooks, R Markdown and/or Quarto.
- Work with databases and data platforms, including Databricks, SQL and Hadoop.
- Write clear, well-documented code that others can understand and maintain.
- Collaborate using Git for version control.
All About You
- Bachelor's degree in Economics (preferred), Statistics, Mathematics, or a related field.
- Proficient in working with relational databases and writing SQL queries.
- Expertise in working with large-scale data processing frameworks and tools, including Hadoop, Apache Spark, Apache Hive, and Apache Impala.
- Proficient in R or Python, with experience in data processing packages.
- Skilled in creating data visualizations to communicate complex economic insights to diverse audiences.
- Experience using data visualization tools such as Tableau or Power BI.
- Proficiency in machine learning, econometric and statistical techniques, including predictive modeling, survival analysis, time series modeling, classification, regression and clustering methods, is desirable.
- Strong problem-solving skills and critical thinking abilities.
- Excellent communication skills, both written and verbal.
- Organized and able to prioritize work across multiple projects.
- Collaborative team player who can also work independently.
- Passionate about data, technology, and creating impactful economic insights.
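As a small illustration of the diagnostic-index work described above, the sketch below computes a year-over-year growth series from transaction-style data with pandas. All figures, sectors, and column names are synthetic placeholders.

```python
# Toy year-over-year growth index from transaction-style data; all figures,
# sectors, and column names are synthetic placeholders.
import pandas as pd

spend = pd.DataFrame({
    "month": pd.to_datetime(["2023-01-01", "2023-02-01", "2024-01-01", "2024-02-01"]),
    "sector": ["retail", "retail", "retail", "retail"],
    "amount": [120.0, 115.0, 131.0, 124.0],
})

monthly = spend.groupby(["sector", "month"], as_index=False)["amount"].sum()
monthly["year"] = monthly["month"].dt.year
monthly["month_of_year"] = monthly["month"].dt.month

# Compare each calendar month against the same month one year earlier.
pivot = monthly.pivot_table(index=["sector", "month_of_year"],
                            columns="year", values="amount")
pivot["yoy_growth"] = pivot[2024] / pivot[2023] - 1
print(pivot)  # yoy_growth: roughly +9.2% for January, +7.8% for February
```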
Posted 1 week ago
7.0 - 12.0 years
10 - 15 Lacs
bengaluru
Work from Office
We are seeking a highly skilled Senior Lead Data Engineer to join our R&D Data Engineering Team. In this role, you will be a key player in shaping the architecture and technical direction of our data platform, ensuring that it meets the evolving needs of the business while adhering to best practices and industry standards. If you're looking for an opportunity to combine your technical skills with strategic thinking and make a real difference, we want to hear from you.
About You - experience, education, skills, and accomplishments:
- Bachelor's degree or equivalent.
- At least 7 years of relevant experience.
- At least 5 years in software development: demonstrated experience in software development, with a focus on Big Data technologies.
- At least 3 years in distributed data processing: proven experience in building scalable distributed data processing solutions.
- At least 3 years in database design: expertise in database design and development, with a strong focus on data model design.
- Strong proficiency with Apache Spark and Airflow: extensive hands-on experience with these technologies, leveraging them for data processing and orchestration.
- Python proficiency: advanced proficiency in Python for data processing and building services.
- Experience with Databricks and Snowflake: practical experience with these platforms, including their use in cloud-based data pipelines.
- Familiarity with Delta Lake or Apache Iceberg: experience working with these data storage formats to decouple storage from processing engines.
- Cloud-based solutions expertise: proven experience in designing and implementing cloud-based data pipelines, with specific expertise in AWS services such as S3, RDS, EMR, and AWS Glue.
- CI/CD best practices: strong understanding and application of CI/CD principles.
It would be great if you also had:
- Knowledge of additional technologies: familiarity with Cassandra, Hadoop, Apache Hive, Jupyter notebooks, and BI tools such as Tableau and Power BI.
- Experience with PL/SQL and Oracle GoldenGate: additional experience in these areas is advantageous.
- Knowledge of any of these technologies/tools: Cassandra, Hadoop, Apache Hive, Snowflake, Jupyter notebooks, the Databricks stack, and AWS services such as EC2, ECS, RDS, EMR, S3, AWS Glue, and Airflow.
What will you be doing in this role?
- Provide technical leadership: offer strategic guidance on technology choices, comparing different solutions to meet business requirements while considering cost control and performance optimization.
- Communicate effectively: exhibit excellent communication skills, with the ability to clearly articulate complex technical concepts to both technical and non-technical stakeholders.
- Design and maintain data solutions: develop and maintain the overall solution architecture for the Data Platform, demonstrating deep expertise in integration architecture and design across multiple platforms at an enterprise scale.
- Enforce best practices: implement and enforce best practices in Big Data management, from software selection to architecture design and implementation processes.
- Drive continuous improvement: contribute to the continuous enhancement of support and delivery functions by staying informed about technology trends and making recommendations for improving application services.
- Lead technical investigations: conduct technical investigations and proofs of concept, both individually and as part of a team, including hands-on coding to provide technical recommendations.
- Knowledge sharing: actively spread knowledge and best practices within the team, fostering a culture of continuous learning and improvement.
About the Team
We are a team located in India, the US, and Europe.
Hours of Work
Regular working hours in India.
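Since the role above leans on Apache Spark and Airflow for processing and orchestration, a minimal Airflow 2.x DAG wiring two pipeline steps together is sketched below. The task bodies are placeholders (prints standing in for Spark or Databricks jobs) and the DAG id and schedule are illustrative assumptions.

```python
# Minimal Airflow 2.x DAG sketch for orchestrating a daily data pipeline;
# task contents and names are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder for the ingestion step (e.g. pulling raw files or API data).
    print("pull raw data for", context["ds"])


def transform(**context):
    # Placeholder for the processing step (e.g. a Spark or Databricks job).
    print("run transformation for", context["ds"])


with DAG(
    dag_id="daily_data_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Run the transformation only after extraction succeeds.
    extract_task >> transform_task
```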
Posted 1 week ago
5.0 - 10.0 years
16 - 20 Lacs
pune
Work from Office
As a Senior Data Architect, you will be instrumental in shaping the bank's enterprise data landscape, supporting teams in designing, evolving, and implementing data architectures that align with the enterprise target state and enable scalable, compliant, and interoperable solutions. You will also serve as the go-to expert and trusted advisor on what good looks like in data architecture, helping to set high standards and drive continuous improvement across the organization. This role is ideal for an experienced data professional with deep technical expertise, strong solution architecture skills, and a proven ability to influence design decisions across both business and technology teams.
Responsibilities
1. Enterprise Data Architecture & Solution Design
- Support teams in designing, evolving, and implementing data architectures that align with the enterprise target state and enable scalable, compliant, and interoperable solutions.
- Serve as the go-to person for data architecture best practices and standards, helping to define and communicate what good looks like to ensure consistency and quality.
- Lead and contribute to solution architecture for key programs, ensuring architectural decisions are well-documented, justified, and aligned to enterprise principles.
- Work with engineering and platform teams to design end-to-end data flows, integration patterns, data processing pipelines, and storage strategies across structured and unstructured data.
- Drive the application of modern data architecture principles including event-driven architecture, data mesh, streaming, and decoupled data services.
2. Data Modelling and Semantics
- Provide hands-on leadership in data modelling efforts, including the occasional creation and stewardship of conceptual, logical, and physical models that support enterprise data domains.
- Partner with product and engineering teams to ensure data models are fit for purpose, extensible, and aligned with enterprise vocabularies and semantics.
- Support modelling use cases across regulatory, operational, and analytical data assets.
3. Architecture Standards & Frameworks
- Define and continuously improve data architecture standards, patterns, and reference architectures that support consistency and interoperability across platforms.
- Embed standards into engineering workflows and tooling to encourage automation and reduce delivery friction.
- Measure and report on adoption of architectural principles using architecture KPIs and compliance metrics.
4. Leadership, Collaboration & Strategy
- Act as a technical advisor and architectural leader across initiatives, mentoring junior architects and supporting federated architecture teams in delivery.
- Build strong partnerships with senior stakeholders across the business, CDIO, engineering, and infrastructure teams to ensure alignment and adoption of the architecture strategy.
- Stay current with industry trends, regulatory changes, and emerging technologies, advising on their potential impact and application.
Skills
- Extensive experience in data architecture, data engineering, or enterprise architecture, preferably within a global financial institution.
- Deep understanding of data platforms, integration technologies, and architectural patterns for real-time and batch processing.
- Proficiency with data architecture tools such as Sparx Enterprise Architect, ERwin, or similar.
- Experience designing solutions in cloud and hybrid environments (e.g. GCP, AWS, or Azure), with knowledge of associated data services.
- Hands-on experience with data modelling, semantic layer design, and metadata-driven architecture approaches.
- Strong grasp of data governance, privacy, security, and regulatory compliance, especially as they intersect with architectural decision-making.
- Strategic mindset, with the ability to connect architectural goals to business value, and communicate effectively with technical and non-technical stakeholders.
- Experience working across business domains including Risk, Finance, Treasury, or Front Office functions.
Well-being & Benefits
Emotionally and mentally balanced: we support you in dealing with life crises, maintaining stability through illness, and maintaining good mental health.
- Empowering managers who value your ideas and decisions. Show your positive attitude, determination, and open-mindedness.
- A professional, passionate, and fun workplace with flexible work-from-home options.
- A modern office with fun and relaxing areas to boost creativity.
- Continuous learning culture with coaching and support from team experts.
Physically thriving: we support you in managing your physical health by taking appropriate preventive measures and providing a workplace that helps you thrive.
- Private healthcare and life insurance with premium benefits for you and discounts for your loved ones.
Socially connected: we strongly believe in collaboration, inclusion and feeling connected to open up new perspectives and strengthen our self-confidence and wellbeing.
- Kids@TheOffice - support for unexpected events requiring you to care for your kids during work hours.
- Enjoy retailer discounts, cultural and CSR activities, employee sport clubs, workshops, and more.
Financially secure: we support you to meet personal financial goals during your active career and for the future.
- Competitive income, performance-based promotions, and a sense of purpose.
- 24 days holiday, loyalty days, and bank holidays (including weekdays for weekend bank holidays).
Posted 1 week ago
0.0 years
1 - 5 Lacs
hyderabad, bengaluru
Work from Office
Call Handling and Messaging: answer inbound calls from potential job seekers, listen to their needs, and qualify them. Provide information on WhatsApp. Pass qualified leads to the recruitment team in a professional and timely manner. Work From Home.
Required Candidate Profile: immediate joiner; work from home; candidate should be based in Hyderabad, New Delhi, Mumbai, Pune, or Bangalore.
Posted 1 week ago
3.0 - 7.0 years
14 - 18 Lacs
gurugram
Work from Office
Title and Summary
Associate Economics Analyst, Global Growth and Operations-2

Overview
The Mastercard Economics Institute (MEI) is an economics lab powering scale at Mastercard by owning economic thought leadership in support of Mastercard's efforts to build a more inclusive and sustainable digital economy. MEI was launched in 2020 to analyze economic trends through the lens of the consumer and to deliver tailored, actionable insights on economic issues for customers, partners, and policymakers. The Institute is composed of a team of economists and data scientists who utilize and synthesize anonymized, aggregated data from the Mastercard network together with public data to bring powerful insights to life, in the form of 1:1 presentations, global thought leadership, media participation, and commercial work through the company's product suites.

About the Role
Mastercard's Economics Institute is seeking a talented Economics Analyst with R or Python programming skills to join our Global Growth and Operations team. Reporting to the Director, Growth and Operations, this individual will blend advanced economic research with strong programming and data visualization skills. This is a unique opportunity for someone passionate about applying economic thinking and technical expertise to real-world questions at the intersection of economics, retail, and commerce. The role involves working on large-scale data analytics, developing innovative economic insights, and building compelling visualizations that help communicate these insights to diverse audiences.
Support client and stakeholder engagements across MEI.
Collaborate with team economists, econometricians, developers, visualization experts, and industry partners.
Develop and test hypotheses at the intersection of economics, retail, and commerce.
Structure work and manage small project streams, delivering impactful results.
Identify creative analyses and develop proprietary diagnostic indices using large and complex datasets, including macroeconomic and big data sources.
Generate insights, synthesize analyses into impactful storylines and interactive visuals, and help write reports and client presentations.
Enhance existing products and partner with internal stakeholders to develop new economic solutions.
Help create thought leadership and intellectual capital.
Create and format analytical content using Jupyter Notebooks, R Markdown, and/or Quarto.
Work with databases and data platforms, including Databricks, SQL, and Hadoop.
Write clear, well-documented code that others can understand and maintain.
Collaborate using Git for version control.

All About You
Bachelor's degree in Economics (preferred), Statistics, Mathematics, or a related field.
Proficient in working with relational databases and writing SQL queries.
Expertise in working with large-scale data processing frameworks and tools, including Hadoop, Apache Spark, Apache Hive, and Apache Impala.
Proficient in R or Python with experience in data processing packages (an illustrative sketch follows this listing).
Skilled in creating data visualizations to communicate complex economic insights to diverse audiences.
Experience using data visualization tools such as Tableau or Power BI.
Proficiency in machine learning, econometric, and statistical techniques, including predictive modeling, survival analysis, time series modeling, classification, regression, and clustering methods, is desirable.
Strong problem-solving skills and critical thinking abilities.
Excellent communication skills, both written and verbal.
Organized and able to prioritize work across multiple projects.
Collaborative team player who can also work independently.
Passionate about data, technology, and creating impactful economic insights.
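The listing above asks for R or Python experience with data processing packages. Purely as a loose illustration, the sketch below uses pandas to turn toy aggregated spend figures into a simple index and growth series; the column names and numbers are invented and do not reflect Mastercard data or methodology.

```python
# Hypothetical sketch of the kind of analysis described above: building a simple
# spending index from toy aggregated figures. All values and names are invented.
import pandas as pd

# Toy aggregated data: monthly spend for one sector (values are illustrative).
df = pd.DataFrame({
    "month": pd.date_range("2023-01-01", periods=6, freq="MS"),
    "sector": ["retail"] * 6,
    "spend": [100.0, 104.0, 98.0, 110.0, 115.0, 120.0],
})

# Index each month's spend to the first observation (base = 100).
base = df.groupby("sector")["spend"].transform("first")
df["spend_index"] = df["spend"] / base * 100

# Month-over-month growth, a common diagnostic series in this type of work.
df["mom_growth_pct"] = df.groupby("sector")["spend"].pct_change() * 100

print(df[["month", "sector", "spend_index", "mom_growth_pct"]])
```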
Posted 1 week ago
5.0 - 10.0 years
4 - 8 Lacs
bengaluru
Work from Office
About The Role
Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : Stibo Product Master Data Management
Good to have skills : Snowflake Data Warehouse
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary : As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your role involves creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems (an illustrative ETL sketch follows this listing). You will play a crucial role in managing data infrastructure and optimizing data workflows.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Develop and maintain data pipelines for efficient data processing.
- Implement ETL processes to extract, transform, and load data across systems.
- Optimize data workflows and ensure data quality throughout the process.
- Collaborate with cross-functional teams to design and deploy data solutions.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Python (Programming Language).
- Good To Have Skills: Experience with Amazon Web Services (AWS), Snowflake Data Warehouse.
- Strong understanding of data engineering principles and best practices.
- Experience in designing and implementing data solutions using Python.
- Knowledge of ETL processes and data migration techniques.
- Familiarity with data warehousing technologies and cloud platforms.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Python (Programming Language).
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification
15 years full time education
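For context on the ETL responsibilities listed above, here is a minimal, hypothetical extract/transform/load sketch in Python. The file name, columns, and SQLite target are illustrative assumptions only, not a Stibo MDM or Snowflake integration.

```python
# Minimal hypothetical ETL sketch: extract from a CSV export, apply basic
# data-quality transforms, and load into a local SQLite table.
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Extract: read raw product records from a CSV export (assumed to exist).
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: trim whitespace, standardize the identifier, drop duplicates.
    df = df.copy()
    df["product_id"] = df["product_id"].astype(str).str.strip()
    return df.drop_duplicates(subset="product_id")

def load(df: pd.DataFrame, db_path: str) -> None:
    # Load: write the cleaned records into a target table.
    with sqlite3.connect(db_path) as conn:
        df.to_sql("products", conn, if_exists="replace", index=False)

if __name__ == "__main__":
    load(transform(extract("products.csv")), "warehouse.db")
```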
Posted 1 week ago
3.0 - 8.0 years
10 - 14 Lacs
bengaluru
Work from Office
About The Role
Project Role : Application Lead
Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills : Databricks Unified Data Analytics Platform
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the development process and ensure successful project delivery.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Lead the design and development of applications.
- Act as the primary point of contact for the project team.
- Ensure timely project delivery and quality assurance.
- Provide technical guidance and mentorship to team members.
- Collaborate with stakeholders to gather requirements and define project scope.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform (an illustrative PySpark sketch follows this listing).
- Strong understanding of data analytics and data processing.
- Experience with cloud-based data platforms.
- Knowledge of data modeling and database design.
- Hands-on experience with ETL processes and data integration.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification
15 years full time education
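As a rough illustration of the Databricks-style work referenced above, the sketch below shows a small PySpark aggregation. The dataset and column names are invented; on Databricks itself a SparkSession is normally provided as `spark`, so creating one here is only for a standalone run.

```python
# Hedged sketch of a simple PySpark transformation: aggregate daily revenue per
# region, a typical step in an ETL or analytics pipeline. Data is illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example").getOrCreate()

orders = spark.createDataFrame(
    [("2024-01-01", "A", 120.0), ("2024-01-01", "B", 80.0), ("2024-01-02", "A", 95.0)],
    ["order_date", "region", "amount"],
)

daily_revenue = (
    orders.groupBy("order_date", "region")
          .agg(F.sum("amount").alias("revenue"))
          .orderBy("order_date", "region")
)

daily_revenue.show()
```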
Posted 1 week ago
3.0 - 7.0 years
7 - 11 Lacs
bengaluru
Work from Office
We are seeking a Software Engineer (EPM) to design, develop, and optimize financial technology solutions across Enterprise Performance Management (EPM), Tax, and Treasury systems. This role will focus on building scalable integrations, automation, and engineering solutions across platforms like OneStream, Kyriba, and tax engines to enhance financial operations. You will collaborate with product owners, solution and platform architects, and finance and engineering teams to drive innovation, streamline financial processes, and ensure compliance. If you have a strong background in .NET development, API integrations, and enterprise financial platforms, we'd love to hear from you!

About You: Experience, Education, Skills, and Accomplishments
- Bachelor's or Master's degree in Computer Science, Software Engineering, Information Systems, or a related field.
- years of experience developing and integrating enterprise financial technology solutions.
- Proven experience with OneStream EPM, Kyriba Treasury, and Tax platforms (NetSuite and Oracle ERP are a plus but not required).
- Expertise in .NET development, REST APIs, and cloud-based financial integrations (a generic API-integration sketch follows this listing).
- Experience building financial workflow automation and process optimization solutions.
- Strong background in API development, financial data processing, payment integrations, and tax compliance automation.
- Familiarity with SQL, JSON, XML, and data transformation techniques.
- Ability to troubleshoot and optimize integrations across enterprise financial systems.

It Would Be Great If You Also Had:
- Experience with M&A, system integration, and ERP migrations.
- Familiarity with cloud-based ERP transformations and SaaS financial platforms.

What Will You Be Doing in This Role?
- Develop and optimize integrations between EPM, Tax, and Treasury platforms and enterprise financial systems.
- Design and implement API-based financial automation for tax calculations, payment processing, and treasury operations.
- Enhance OneStream EPM capabilities by developing custom solutions for financial planning and consolidations.
- Support treasury automation projects, integrating Kyriba with banking partners for payments and cash management.
- Troubleshoot and optimize system performance, ensuring data accuracy, compliance, and scalability.
- Collaborate with product managers, finance teams, and engineering teams to define technical solutions that align with business objectives.
- Ensure adherence to governance, security, and compliance requirements for financial data processing.
- Document integration workflows, engineering solutions, and troubleshooting guides.

Product You Will Be Developing
As a Software Engineer, you will focus on developing, customizing, and enhancing financial systems, including Tax, Treasury, and Enterprise Performance Management (EPM) platforms. Your work will involve building scalable integrations, automation, and engineering solutions within platforms like OneStream EPM, Kyriba Treasury, and tax engines to improve financial operations.

About the Team
You will work closely with finance, tax, and engineering teams. The team is responsible for enterprise financial automation and ERP integrations, ensuring scalability and efficiency in finance operations.
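The role above centers on .NET, but purely as a language-neutral illustration of the API-based integration pattern it describes, here is a short Python sketch. The endpoint, payload fields, and credential are hypothetical placeholders and are not OneStream's or Kyriba's actual APIs.

```python
# Hypothetical sketch of posting a payment instruction to a REST API.
# The base URL, token, and payload shape are invented for illustration only.
import json
import requests

API_BASE = "https://api.example.com/treasury"   # hypothetical endpoint
TOKEN = "replace-with-real-credential"          # hypothetical credential

def submit_payment(payment: dict) -> dict:
    """Post a payment instruction and return the parsed JSON response."""
    resp = requests.post(
        f"{API_BASE}/payments",
        headers={"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"},
        data=json.dumps(payment),
        timeout=30,
    )
    resp.raise_for_status()  # surface integration errors early
    return resp.json()

if __name__ == "__main__":
    print(submit_payment({"amount": 1250.00, "currency": "USD", "beneficiary": "ACME Corp"}))
```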
Posted 1 week ago