4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
You should have at least 4 years of proven experience in designing, implementing, and executing ETL testing procedures. Your role will involve validating data transformations to ensure the accuracy of data movement across systems. Strong expertise in testing database systems for data accuracy, integrity, and performance is essential, and you will be responsible for creating and executing database test scripts and queries. In-depth, hands-on experience in testing data warehouses is required: you must be able to validate data extraction, loading, and transformation processes within a data warehouse environment. Proficiency in writing and optimizing SQL queries for data validation and analysis is a key skill, as is experience using SQL to retrieve, manipulate, and compare data across databases. Experience with ETL tools such as Informatica, Talend, or similar is necessary; familiarity with test automation tools and frameworks would be a plus. If you meet these qualifications and are interested in this full-time position based in Gurugram/Noida, India, please contact us at 450 Century Pkwy Suite 250, Allen, TX 75013, USA, or call +1 (469) 570-8638.
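To make the SQL-validation requirement concrete, here is a minimal, hypothetical Python sketch of the kind of reconciliation check an ETL tester writes; the SQLite file and the stg_orders/dw_orders tables are invented stand-ins for a real staging-to-warehouse pair.

```python
import sqlite3

# Hypothetical database and tables standing in for a staging/warehouse pair.
conn = sqlite3.connect("warehouse.db")

def row_count(table: str) -> int:
    """Return the row count for a table."""
    cur = conn.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

source_rows = row_count("stg_orders")   # staging (pre-transformation)
target_rows = row_count("dw_orders")    # warehouse (post-load)

assert source_rows == target_rows, (
    f"Row-count mismatch: staging={source_rows}, warehouse={target_rows}"
)

# Spot-check a transformation rule: totals should survive the load.
src_total = conn.execute("SELECT SUM(amount) FROM stg_orders").fetchone()[0]
tgt_total = conn.execute("SELECT SUM(amount) FROM dw_orders").fetchone()[0]
assert src_total == tgt_total, "Aggregate mismatch after transformation"
```

In practice the same counts and aggregates would be compared on the actual source and target platforms (Oracle, SQL Server, and the like), often per partition or per load batch.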
Posted 2 days ago
4.0 - 8.0 years
4 - 8 Lacs
Pune, Maharashtra, India
On-site
Qualifications:
- A graduate degree or above, acquired through full-time education.
- Experience in end-to-end Power BI design, development, and implementation.
- Experience in client communication and requirement gathering; able to discuss requirements effectively with client teams and internal teams.
- End-to-end solution design, stakeholder engagement, and delivery.
- Strong experience with the SDLC in both waterfall and agile life cycles.
- Expert-level skills in data extraction, data transformations, data mapping, data analysis, data modelling, security, optimization, and tuning.
- Ability to recognize business requirements in the context of BI and create data models that transform raw data into relevant insights.
- Ability to use Power BI to run DAX queries and functions.
- Familiarity with MS SQL Server BI Stack tools and technologies, such as SSRS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Proficiency in creating and optimizing Power Query transformations and Power Pivot data models to enhance data processing and analytics within Power BI.
- Ability to work on complex scenarios with optimized code.
- BI certifications are an added advantage.
Posted 1 week ago
9.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
We are seeking a Python Tech Lead with 9 to 14 years of experience in Python development and exposure to the AWS cloud for a full-time employment position. The ideal candidate will have deep expertise in Python development, with hands-on AWS experience. As a Python AWS Lead, you will be responsible for application development using microservices or REST APIs, data processing, transformations, migrations, and building API-based integration solutions.

Key Responsibilities:
- Deep understanding of Python development and lead experience
- Hands-on experience with Python programming and the AWS cloud
- Application development using microservices or REST APIs
- Experience in data processing, transformations, and migrations
- Knowledge of user authentication and authorization protocols
- Building API-based integration solutions with third-party cloud/on-prem applications
- Working knowledge of the Flask and Django frameworks
- Source code versioning using Git/SVN; CI/CD using Jenkins/GitLab/Google services; databases such as Oracle, MySQL, and Teradata
- Strong work ethic and interpersonal skills
- Familiarity with NoSQL databases such as MongoDB and CouchDB
- Proficiency in tools such as JIRA and Confluence
- Understanding of Agile methodologies and prioritization skills

Benefits:
- Competitive salary and benefits package
- Talent development focus with quarterly promotion cycles and company-sponsored education/certifications
- Exposure to cutting-edge technologies
- Employee engagement initiatives and flexible work hours
- Annual health check-ups
- Insurance coverage for self, spouse, children, and parents

Inclusive Environment: Persistent Ltd. is committed to fostering diversity and inclusion in the workplace. We encourage applications from all qualified individuals, including those with disabilities, regardless of gender or gender preference. Our company offers hybrid work options, flexible hours, and accessible facilities for employees with physical disabilities. We strive to create an inclusive environment where all employees can thrive. At Persistent, we aim to provide a values-driven, people-centric work environment that enables employees to grow both professionally and personally. Join us to make a positive impact using the latest technologies, enjoy collaborative innovation, and unlock global opportunities for learning and development. Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment in any form. Let's unleash your full potential at Persistent.
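Since the posting names Flask and REST-API development explicitly, here is a minimal, illustrative Flask sketch; the records resource and its in-memory store are hypothetical stand-ins for the Oracle/MySQL-backed services described.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store standing in for Oracle/MySQL in this sketch.
RECORDS = {"1": {"id": "1", "status": "migrated"}}

@app.get("/records/<record_id>")
def get_record(record_id):
    record = RECORDS.get(record_id)
    if record is None:
        return jsonify(error="not found"), 404
    return jsonify(record)

@app.post("/records")
def create_record():
    payload = request.get_json(force=True)
    RECORDS[payload["id"]] = payload
    return jsonify(payload), 201

if __name__ == "__main__":
    app.run(port=5000)
```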
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
Are you innovative and passionate about building secure and reliable solutions? We are looking for Data Engineers and DevSecOps Engineers to join our team in building the Enterprise Data Mesh at UBS. We are open to adapting the role to suit your career aspirations and skill set.

Responsibilities include:
- Designing, documenting, developing, reviewing, testing, releasing, and supporting Data Mesh components, platforms, and environments.
- Contributing to agile ceremonies such as daily stand-ups, backlog refinement, iteration planning, iteration reviews, and retrospectives.
- Ensuring compliance with the firm's applicable policies and processes.
- Collaborating with other teams and divisions using Data Mesh services, related guilds, and other Data Mesh Services teams.
- Ensuring delivery deadlines are met.

You will be part of a diverse global team of data scientists, data engineers, full-stack developers, DevSecOps engineers, and knowledge engineers within Group CTO. The team works primarily locally, with some interaction with other teams and divisions. We provide various services as part of our firmwide Data Mesh strategy to automate and scale data management, improving time-to-market for data and reducing data downtime. You will have access to learning opportunities and a varied technology landscape including Azure Cloud, AI (ML and GenAI models), web user interfaces (React), data storage (Postgres, Azure), REST APIs, Kafka, Great Expectations, and ontology models. Hands-on delivery experience in data transformations, Spark, Python, database design and development, CI/CD pipelines, security risk mitigation, infrastructure as code (e.g., Terraform), monitoring, and Azure development is required. Proficiency in agile software practices and tools, performance testing, unit and integration testing, identifying root causes, designing and implementing solutions, collaborating with other teams, and reskilling in new technologies is essential.

UBS is the world's largest and the only truly global wealth manager, operating through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management, and the Investment Bank. With a presence in all major financial centers in more than 50 countries, UBS stands out for its global reach and expertise. UBS is an Equal Opportunity Employer that values and seeks to empower each individual while supporting diverse cultures, perspectives, skills, and experiences within its workforce.
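As a hedged sketch of the Spark-plus-data-quality work this role describes (not an actual UBS pipeline), the following PySpark job reads a hypothetical raw dataset, applies a transformation, and gates publication on a hand-rolled expectation in the spirit of the Great Expectations tooling mentioned above.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Paths and column names below are invented for illustration.
spark = SparkSession.builder.appName("data-mesh-sketch").getOrCreate()

raw = spark.read.parquet("/data/trades/raw")

cleaned = (
    raw.dropDuplicates(["trade_id"])
       .withColumn("trade_date", F.to_date("trade_ts"))
       .filter(F.col("notional") > 0)
)

# A simple expectation: abort the publish if any trade lost its identifier.
null_ids = cleaned.filter(F.col("trade_id").isNull()).count()
if null_ids:
    raise ValueError(f"{null_ids} rows have a null trade_id; aborting publish")

cleaned.write.mode("overwrite").parquet("/data/trades/curated")
```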
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
noida, uttar pradesh
On-site
As an experienced Power BI Architect with knowledge of Microsoft Fabric Solutions, you will be responsible for leading the design, development, and implementation of innovative Business Intelligence (BI) solutions. Your expertise in enterprise data architecture, analytics platforms, and data integration strategies will be crucial in optimizing data pipelines and enhancing performance through the effective use of Power BI and Microsoft Fabric. Your key responsibilities will include developing comprehensive Power BI solutions such as dashboards, reports, and data models to meet business requirements. You will lead the end-to-end development lifecycle of BI projects, from requirement gathering to deployment, ensuring optimal performance. Utilizing Microsoft Fabric, you will streamline data pipelines, integrate data engineering, storage, and processing capabilities, and enhance performance and scalability by integrating Power BI with Microsoft Fabric. Your role will also involve working with Azure Data Services like Azure Data Lake, Azure Synapse, and Azure Data Factory to support BI architecture. Implementing best practices in Power BI development, providing leadership and mentorship to a team of developers, overseeing project management tasks, and collaborating with data engineers and stakeholders to translate business requirements into scalable BI solutions will be part of your responsibilities. To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field. You must have 10-15 years of experience in BI development, including at least 3 years in a leadership position. Proven experience with Power BI, Microsoft Fabric, and Azure Data Services is also required for this position.
Posted 2 weeks ago
10.0 - 15.0 years
0 Lacs
noida, uttar pradesh
On-site
As an experienced Power BI Architect with extensive knowledge of Microsoft Fabric, you will be responsible for leading the design, development, and implementation of innovative Business Intelligence (BI) solutions. Your expertise in enterprise data architecture, analytics platforms, and data integration strategies will be crucial in optimizing data pipelines and driving performance and scalability through the effective use of Power BI and Microsoft Fabric. Your key responsibilities will include developing comprehensive Power BI solutions such as dashboards, reports, and data models to meet business needs. You will lead the entire lifecycle of BI projects, from requirement gathering to deployment, ensuring optimal performance. Utilizing Microsoft Fabric, you will streamline data pipelines by integrating data engineering, data storage, and data processing capabilities. Integration of Power BI with Microsoft Fabric will be essential for improved performance, scalability, and efficiency. Your role will also involve working with Azure Data Services (e.g., Azure Data Lake, Azure Synapse, Azure Data Factory) to support the BI architecture. Establishing and implementing best practices in Power BI development, including DAX functions, data transformations, and data modeling, will be part of your responsibilities. Additionally, you will lead and mentor a team of Power BI developers, ensuring high-quality output and adherence to best practices. You will oversee task prioritization, resource allocation, and project timelines to ensure timely and successful delivery of BI solutions. Collaboration with data engineers and stakeholders to translate business requirements into functional, scalable BI solutions will be crucial. Driving BI initiatives to ensure alignment with business goals and objectives will also be a key aspect of your role. To qualify for this position, you should have a Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field. You should have 10-15 years of experience in BI development, with at least 3 years in a leadership role. Proven experience with Power BI, Microsoft Fabric, and Azure Data Services will be essential for success in this role.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
chandigarh
On-site
Are you eager to launch your career as an SAP Business Warehouse (BW) consultant? Look no further! Join our team at Saber Well Consulting as an Associate Consultant and embark on a journey of growth, learning, and professional development in the exciting world of SAP BW/4HANA. You will collaborate closely with experienced consultants to assist in SAP BW/4HANA implementations, gaining hands-on experience in configuring, optimizing, and maintaining SAP BW/4HANA solutions. Your role will involve working closely with clients to understand their business requirements and contributing to crafting data solutions. Additionally, you will participate in the development and enhancement of data models, data flows, and data transformations, and support data extraction, loading, and validation processes. Assisting end-users with data reporting and analysis needs and actively contributing to troubleshooting and resolving SAP BW/4HANA-related issues are also key responsibilities.

To be successful in this role, you should have experience working in SAP BW, a Bachelor's degree in a relevant field, enthusiasm, and a strong desire to learn and grow in the field of SAP BW/4HANA. Good analytical and problem-solving skills, excellent communication and team-collaboration abilities, and the ability to adapt quickly in a fast-paced environment are essential. Onsite presence from Day 1 is mandatory. Preferred qualifications include completed coursework or training related to SAP BW/4HANA, familiarity with SAP data integration technologies, exposure to data visualization tools such as SAP BusinessObjects and Tableau, and progress toward SAP BW/4HANA certification.

We offer a structured training and mentorship program designed for entry-level consultants, exciting opportunities for skill development and career advancement, a competitive rate, and a supportive and inclusive work environment that encourages innovation and collaboration. If you are ready to begin your career in SAP BW/4HANA consulting and join a dynamic team dedicated to your success, apply today by sending your resume and a cover letter to sapbwconsltnt@gmail.com with the subject line "SAP BW4HANA Associate Consultant Application." At Saber Well Consulting, we are committed to helping you build a successful career in SAP BW/4HANA consulting. Join us, and let's shape the future of data solutions together!

This is a full-time position with a Monday-to-Friday schedule, night shifts, and weekend availability. A joining bonus is provided, and the work location is in person.
Posted 2 weeks ago
7.0 - 11.0 years
0 - 0 Lacs
haryana
On-site
As a Data Scientist with 7-8 years of experience, you will be responsible for utilizing your expertise in AWS cloud and machine learning within the health domain. You will primarily work on processing large datasets, conducting exploratory data analysis, and extracting valuable insights to support decision-making processes. Your proficiency in Python or R along with data visualization methods will be crucial for this role. The ideal candidate should have a strong background in data analysis and be adept at handling complex datasets. Familiarity with healthcare or mental health datasets is preferred, along with a solid understanding of database querying and data transformations. Your problem-solving skills and meticulous attention to detail will play a key role in deriving actionable insights from the data. In addition to technical skills, effective communication is essential for this role. You will be required to communicate your findings and recommendations clearly and collaborate with team members effectively. While working independently when necessary, a collaborative attitude will enhance your contribution to the team. Preferred qualifications include prior experience in healthcare or mental health datasets, a background in statistical modeling or machine learning, and familiarity with socio-demographic data analysis. If you are a data-driven professional looking to make a meaningful impact in the healthcare domain, this role offers an exciting opportunity to apply your skills and expertise effectively.
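By way of illustration, the exploratory-data-analysis loop this role describes might start with a pass like the following pandas sketch; the file and the age/outcome columns are hypothetical, not a real health dataset.

```python
import pandas as pd

# Hypothetical anonymized health dataset for illustration only.
df = pd.read_csv("patients.csv")

# Basic profiling: shape, missingness, and summary statistics.
print(df.shape)
print(df.isna().mean().sort_values(ascending=False).head())
print(df.describe(include="all").T)

# A typical socio-demographic cut: outcome rates by age band.
df["age_band"] = pd.cut(df["age"], bins=[0, 18, 40, 65, 120],
                        labels=["<18", "18-39", "40-64", "65+"])
print(df.groupby("age_band", observed=True)["outcome"].mean())
```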
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
We are looking for a skilled Data Engineer to join our team, working on end-to-end data engineering and data science use cases. The ideal candidate will have strong expertise in Python or Scala, Spark (Databricks), and SQL, building scalable and efficient data pipelines on Azure.

Responsibilities:
- Design, build, and maintain scalable ETL/ELT data pipelines using Azure Data Factory, Databricks, and Spark.
- Develop and optimize data workflows using SQL and Python or Scala for large-scale data processing and transformation.
- Implement performance tuning and optimization strategies for data pipelines and Spark jobs to ensure efficient data handling.
- Collaborate with data engineers to support feature engineering, model deployment, and end-to-end data engineering workflows.
- Ensure data quality and integrity by implementing validation, error-handling, and monitoring mechanisms.
- Work with structured and unstructured data using technologies such as Delta Lake and Parquet within a Big Data ecosystem.
- Contribute to MLOps practices, including integrating ML pipelines, managing model versioning, and supporting CI/CD processes.

Primary skills required:
- Proficiency in the Azure data platform (Data Factory, Databricks).
- Strong skills in SQL and either Python or Scala for data manipulation.
- Experience with ETL/ELT pipelines and data transformations.
- Familiarity with Big Data technologies (Spark, Delta Lake, Parquet).
- Expertise in data pipeline optimization and performance tuning.
- Experience in feature engineering and model deployment.
- Strong troubleshooting and problem-solving skills.
- Experience with data quality checks and validation.

Nice-to-have skills:
- Exposure to NLP, time-series forecasting, and anomaly detection.
- Familiarity with data governance frameworks and compliance practices.
- AI/ML basics such as ML and MLOps integration.
- Experience supporting ML pipelines with efficient data workflows.
- Knowledge of MLOps practices (CI/CD, model monitoring, versioning).

At Tesco, we are committed to providing the best for our colleagues. Total Rewards at Tesco are determined by four principles: simple, fair, competitive, and sustainable. Colleagues are entitled to 30 days of leave (18 days of earned leave, 12 days of casual/sick leave) and 10 national and festival holidays. Tesco runs programs supporting health and wellness, including insurance for colleagues and their families, mental health support, financial coaching, and physical wellbeing facilities on campus. Tesco in Bengaluru is a multi-disciplinary team serving customers, communities, and the planet, with the goal of creating a sustainable competitive advantage for Tesco by standardizing processes, delivering cost savings, enabling agility through technological solutions, and empowering colleagues. The Tesco Technology team consists of over 5,000 experts spread across the UK, Poland, Hungary, the Czech Republic, and India, in roles including Engineering, Product, Programme, Service Desk and Operations, Systems Engineering, Security & Capability, and Data Science.
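A minimal sketch of the Databricks/Delta Lake pipeline step described above, assuming a Databricks-style environment where Delta is available; the mount paths, tables, and columns are invented.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Read a hypothetical bronze-layer Delta table.
orders = spark.read.format("delta").load("/mnt/bronze/orders")

# Transform: derive a date and aggregate to a daily grain.
daily = (
    orders.withColumn("order_date", F.to_date("order_ts"))
          .groupBy("order_date", "store_id")
          .agg(F.sum("amount").alias("revenue"),
               F.countDistinct("order_id").alias("orders"))
)

# Partitioning by date is a common tuning lever for downstream reads.
(daily.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("/mnt/silver/daily_revenue"))
```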
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world. We are seeking a talented and driven Power BI Developer to join our team. The ideal candidate will be responsible for collecting, analysing, and interpreting complex data sets to drive informed business decisions. You will work closely and directly with the Client and cross-functional teams to identify trends, patterns, and insights that will contribute to our company's growth. In this role, you will play a key role in developing, designing, and maintaining Power BI dashboards and reports to provide actionable insights. You will collaborate with business stakeholders to understand their data requirements and translate them into technical specifications. Additionally, you will implement data models, data transformations, and data visualizations using Power BI. The ideal candidate should have a minimum of 5 years of experience in Power BI development. You will be required to automate data extraction, transformation, and loading (ETL) processes to ensure efficient data flow. Moreover, you will integrate Power BI with other data sources and systems to create comprehensive reporting solutions. You will also be responsible for optimizing Power BI performance and troubleshooting issues as they arise, ensuring data accuracy, consistency, and security in all reports and dashboards. At Capgemini, you will receive comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, or new parent support via flexible work arrangements. We are committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work. You will have the opportunity to work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges. Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world. With over 55 years of heritage, Capgemini is trusted by its clients to unlock the value of technology and address the entire breadth of their business needs. The company delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with deep industry expertise and a strong partner ecosystem.
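As a rough illustration of the ETL automation this role mentions, here is a small pandas sketch that extracts two hypothetical exports, joins and derives reporting columns, and lands a curated file that a Power BI dataset could connect to; every name in it is invented.

```python
import pandas as pd

# Extract: hypothetical source exports.
sales = pd.read_csv("exports/sales.csv", parse_dates=["invoice_date"])
regions = pd.read_csv("exports/regions.csv")

# Transform: standardize keys, join dimensions, derive a reporting column.
sales["region_code"] = sales["region_code"].str.upper().str.strip()
merged = sales.merge(regions, on="region_code", how="left", validate="m:1")
merged["fiscal_month"] = merged["invoice_date"].dt.to_period("M").astype(str)

# Guardrail: surface unmapped regions instead of silently dropping them.
unmapped = merged["region_name"].isna().sum()
if unmapped:
    print(f"warning: {unmapped} rows have no region mapping")

# Load: a curated file for the reporting layer to pick up.
merged.to_parquet("curated/sales_monthly.parquet", index=False)
```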
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
You will be responsible for building systems and APIs to collect, curate, and analyze data generated by biomedical dogs, devices, and patient data. Your immediate focus will be developing APIs and backends to handle Electronic Health Record (EHR) data, time-series sensor streams, and sensor/hardware integrations via REST APIs. You will also work on data pipelines and analytics for physiological, behavioral, and neural signals; machine learning and statistical models for biomedical and detection-dog research; and web and embedded integrations connecting software to real-world devices.

To excel in this role, you should be familiar with domains such as signal processing, basic statistics, stream processing, online algorithms, databases (especially time-series databases like VictoriaMetrics, and SQL databases including Postgres, SQLite, and DuckDB), computer vision, and machine learning. Proficiency in Python, C++, or Rust is essential: the stack is primarily Python, with some modules in Rust/C++ where necessary. Firmware development is done in C/C++ (or Rust), and if you choose to work in C++/Rust, you may need to expose a Python API using pybind11/PyO3.

Your responsibilities will involve developing data pipelines for real-time and batch processing, building robust APIs and backends for devices, research tools, and data systems, handling data transformations, storage, and querying for structured and time-series datasets, evaluating and enhancing ML models and analytics, and collaborating with hardware and research teams to derive insights from messy real-world data. The focus is on data integrity and correctness rather than brute-force scaling. If you enjoy creating reliable software and working with complex real-world data, we look forward to discussing this opportunity with you.

Key skills: backend development, computer vision, data transformations, databases, analytics, data querying, C, Python, C++, signal processing, data storage, statistical models, API development, Rust, data pipelines, firmware development, stream processing, machine learning.
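As a small example of the signal-processing side of this stack, here is a Python sketch that low-pass filters a noisy synthetic trace with SciPy; the sampling rate and cutoff are illustrative, not values from the team's actual sensors.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0                      # Hz, hypothetical sensor sampling rate
t = np.arange(0, 10, 1 / fs)
# Synthetic "physiological" trace: a slow oscillation plus sensor noise.
raw = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)

# 4th-order Butterworth low-pass at 5 Hz, applied forwards and backwards
# (filtfilt) so the filtered trace has zero phase shift.
b, a = butter(N=4, Wn=5.0, fs=fs, btype="low")
smoothed = filtfilt(b, a, raw)

print(f"raw std: {raw.std():.3f}, smoothed std: {smoothed.std():.3f}")
```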
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
You will be joining a trusted global innovator of IT and business services as a Quadient Inspire Developer. Your role will involve designing and developing CCM solutions using the Quadient Inspire suite to help clients transform through digital and IT modernization. You will be responsible for creating document templates and workflows, setting up Interactive modules, and utilizing data modules for workflow creation within Inspire Designer. Your expertise in writing rules, scripts, and data transformations will be crucial for form development based on business specifications. Additionally, you will actively participate in architecture discussions, gather requirements, and collaborate with various stakeholders in software design sessions.

Furthermore, your role will entail integrating the Quadient Inspire suite within the software lifecycle, supporting document processing and output operations, and managing data exchange with upstream and downstream systems on platforms such as Unix, Windows, and Linux. Proficiency in Oracle DB, SQL queries, and monitoring tools like the IAScaler Dashboard will be essential for ensuring seamless operation and output delivery.

To excel in this role, you are required to have 4 to 5 years of experience working with Quadient Inspire or similar document generation tools. Strong programming skills in languages such as Java or Python, along with experience in scripting languages like JavaScript and TypeScript, will be advantageous. Your troubleshooting ability and attention to detail, coupled with excellent communication skills, will enable you to work both independently and collaboratively in a team environment. Experience with other document generation tools such as DocuSign and Adobe LiveCycle, and knowledge of database management, SQL, web services, and APIs, are preferred skills. As a Quadient Inspire Developer, you will contribute significantly to the success of clients and society as they transition into the digital future.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
The ideal Integration Data Engineer is a dynamic contributor, passionate about software development, and dedicated to making a meaningful impact within a team of smart, creative individuals. The successful candidate will be part of a team responsible for developing and supporting integration solutions connecting modern SaaS platforms (including Salesforce, Workday, Workfront, and Microsoft D365), traditional back-office systems, and strategic data platforms. We seek a candidate experienced in data integrations, data transformations, query/database optimization, consuming web services, and working with APIs. The candidate must be technically proficient with strong interpersonal, troubleshooting, and documentation skills. We need our engineers to be versatile, display leadership qualities, and be enthusiastic about taking on new problems. This role offers the opportunity to work on exciting projects and make a significant impact within the organization.

Key Responsibilities:
- Review business requirements to identify integration needs and design, develop, and test complex integration solutions.
- Develop and maintain scalable data pipelines and build out new API integrations for data transfer.
- Engage with key stakeholders to understand and define integration requirements.
- Collaborate with architects and senior technical leads to create and enhance complex integration components.
- Understand and document data structures and business rules within source systems.
- Contribute to the analysis, design, development, and delivery of integration projects by performing complex data mapping and data conversion activities.
- Collaborate with other teams in Data Engineering and other Technology functions to deliver secure, reliable, robust, and scalable integrated solutions.
- Perform quality assurance and testing at the unit level.
- Troubleshoot and take ownership of issues in development, test, and production environments, including performance optimization and continuous tuning.
- Continuously learn about and evaluate the latest development methods, tools, and technologies.
- Demonstrate strong problem-solving ability, logic, and analytical skills.
- Manage multiple projects in a fast-paced environment.

Key Skills and Experience:
- 3-5 years of experience as an integration engineer.
- Demonstrable experience with data engineering and data-related projects.
- Hands-on project delivery experience across several data-related initiatives, including data design, data mapping, and data quality assessment.
- Experience building integrations with at least one integration platform, e.g. SnapLogic or Boomi.
- Proficiency in one or more data-related programming languages.
- Experience in Agile development.
- Knowledge of cloud platforms such as Azure and AWS.
- Experience with AI technologies and machine learning frameworks is a plus, as it will enhance our ability to innovate and improve our integration solutions.
- A drive to continually learn about and incorporate new technologies.
- The ability to thrive in a self-driven environment.
- Strong interpersonal and communication skills.

Location: Mumbai
Brand: Dentsu
Time Type: Full time
Contract Type: Permanent
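To ground the API-integration work described above, here is a hedged Python sketch of a paginated pull from a hypothetical REST endpoint followed by a simple field mapping; the URL, token placeholder, and field names are all invented (stand-ins for, say, a Salesforce or Workday extract).

```python
import requests

BASE_URL = "https://api.example.com/v1"   # hypothetical endpoint
session = requests.Session()
session.headers["Authorization"] = "Bearer <token>"  # placeholder credential

def fetch_all(resource: str):
    """Follow simple page-based pagination until the API returns no items."""
    page = 1
    while True:
        resp = session.get(f"{BASE_URL}/{resource}",
                           params={"page": page}, timeout=30)
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            return
        yield from batch
        page += 1

# Map the source payload onto the target system's field names.
records = [
    {"employee_id": item["id"], "dept": item.get("department", "UNKNOWN")}
    for item in fetch_all("workers")
]
print(f"prepared {len(records)} records for downstream load")
```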
Posted 1 month ago
1.0 - 6.0 years
10 - 14 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
Responsibilities:
- Gather requirements from business experts; analyse, design, and develop reports using Power BI; and demonstrate valuable insights.
- Process, cleanse, and verify the integrity of data used for analyses.
- Develop insightful and interactive business intelligence reports and dashboards.

Skills and Qualifications

Must have:
- At least 2 years of experience developing Power BI dashboards.
- Experience in publishing dashboards and reports to external customers.
- Proficiency with data analysis using DAX queries, including Time Intelligence functions.
- Experience in Power BI administration and security.
- Experience in integrating Power BI reports into web applications using Power BI Service.
- Experience in conceptualizing, templating, and transforming traditional reports into analytical dashboards as part of the digital transformation process.
- Expertise in performing in-depth data analysis using Microsoft Excel and its advanced functions.
- A degree in Computer Science, Mathematics, Statistics, or another related technical field, or equivalent practical experience.
- Excellent English verbal and written communication skills.

Nice to have:
- Experience in data acquisition and in performing data transformations and aggregations using SQL or Python (a small pandas sketch of this idea follows).
- Familiarity with Microsoft Azure services and tools.
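DAX itself is not shown here, but the time-intelligence idea the must-have list refers to (for example, a TOTALYTD-style measure) maps onto a grouped cumulative sum, sketched below in pandas with made-up figures.

```python
import pandas as pd

# Year-to-date running total: the pandas analogue of a DAX TOTALYTD measure.
df = pd.DataFrame({
    "month": pd.to_datetime(["2024-01-31", "2024-02-29", "2024-03-31",
                             "2025-01-31", "2025-02-28"]),
    "sales": [100, 120, 90, 150, 130],
})
df["ytd_sales"] = df.groupby(df["month"].dt.year)["sales"].cumsum()
print(df)  # the YTD column resets at each new year, as TOTALYTD would
```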
Posted 1 month ago
3.0 - 7.0 years
8 - 18 Lacs
Pune, Bengaluru
Work from Office
Mulesoft Developer - Con - BLR / Pune - J49091

Key Responsibilities:
- Design and implement integration solutions using the MuleSoft Anypoint Platform, ensuring high performance, scalability, and reliability.
- Develop APIs and Mule applications based on business requirements.
- Collaborate with stakeholders to understand integration needs and deliver high-quality solutions.
- Build and maintain RESTful and SOAP web services.
- Perform integration testing and ensure seamless connectivity between internal and external systems.
- Troubleshoot and resolve integration issues related to performance, connectivity, and data inconsistencies.
- Contribute to the creation of reusable assets, templates, and best practices for integration development.
- Participate in code reviews to ensure code quality, performance, and security standards are met.
- Provide guidance and mentoring to junior developers on MuleSoft best practices and development techniques.
- Ensure proper documentation of API interfaces, technical specifications, and deployment procedures.
- Work in an Agile environment, collaborating with cross-functional teams including business analysts, QA, and infrastructure teams.

Technical Expertise:
- Proficient in building and deploying APIs and integrations using MuleSoft.
- Strong knowledge of MuleSoft connectors and data transformations (see the sketch below).
- Experience with RESTful APIs, SOAP, web services, JMS, and other integration protocols.
- Familiarity with Anypoint Studio, API Manager, Anypoint Exchange, and CloudHub.
- Good understanding of Java, XML, JSON, and other integration-related technologies.
- Knowledge of databases (SQL/NoSQL) and experience with data integration.

Certifications: MuleSoft Certified Developer (Mule 4) is preferred.
Tools: Familiarity with CI/CD tools such as Jenkins, Git, and Maven.

Additional Skills (Preferred):
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Knowledge of service-oriented architecture (SOA) and event-driven architecture (EDA).
- Experience designing API-first solutions and working with API gateways.
- Familiarity with security protocols (OAuth, JWT, SAML).

Qualification: BE-Comp/IT, BE-Other, BTech-Comp/IT, BTech-Other, MBA, MCA
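MuleSoft transformations are normally written in DataWeave rather than Python; purely to illustrate the payload-mapping idea behind the "data transformations" item above, here is a plain-Python sketch with invented source and target fields.

```python
import json

# Invented source payload, standing in for an inbound API message.
source_payload = json.loads("""
{"customer": {"firstName": "Asha", "lastName": "Rao"},
 "orders": [{"sku": "A1", "qty": 2}, {"sku": "B7", "qty": 1}]}
""")

# Map source fields onto the target system's schema.
target_payload = {
    "fullName": f'{source_payload["customer"]["firstName"]} '
                f'{source_payload["customer"]["lastName"]}',
    "lineItems": [
        {"productCode": o["sku"], "quantity": o["qty"]}
        for o in source_payload["orders"]
    ],
}
print(json.dumps(target_payload, indent=2))
```

In a Mule flow, the same mapping would live in a DataWeave script inside a Transform Message component; the Python version only shows the shape of the transformation.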
Posted 2 months ago