
1655 ADF Jobs - Page 38

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site

Job Title: Senior Developer / Tech Lead
Location: Trivandrum / Kochi
Experience: 6+ years
Key Skills: C#, .NET Core, Microservices, Web API, Azure Data Factory, Azure Services
Notice Period: Immediate to 30 days

Job Purpose
We are looking for a highly skilled Senior Developer / Tech Lead to join our team in Trivandrum. The ideal candidate will be responsible for designing, developing, and delivering high-quality cloud-based enterprise applications using modern technologies and practices. You will collaborate with product owners, architects, and development teams to create scalable, maintainable, and secure solutions.

Duties and Responsibilities
- Collaborate with product owners and architects to understand business and technical requirements and design scalable solutions.
- Design, develop, and deploy cloud-based applications with clearly defined DevOps and release processes.
- Build microservices-based architectures using C#, .NET Core, and Azure services.
- Apply Test Driven Development (TDD) practices to build robust, testable code.
- Perform code reviews, troubleshoot technical issues, and mentor team members.
- Continuously improve code quality and development practices.
- Identify and manage project risks and constraints, proactively driving resolution.
- Solve complex performance and scalability issues.
- Create and maintain technical documentation.
- Lead the development of Proof of Concepts (POCs) to validate solution approaches and reduce technical risk.
- Work within Agile/Scrum frameworks for iterative development and delivery.
Skills and Competencies

Must-Have:
- Strong hands-on experience (6+ years) with C#, .NET Core, Microservices, Web API
- Azure Services: Azure Service Bus, AKS, Azure Functions, Azure Data Factory (ADF) - pipelines, data flows, triggers, linked services
- Databases: SQL Server, Azure SQL, Cosmos DB
- ETL processes, Data Lake, Blob Storage
- APIs and data formats: RESTful APIs, JSON, XML
- Experience with CI/CD pipelines and DevOps practices
- Strong object-oriented programming (OOP) knowledge and experience designing complex entity relationships
- Familiarity with Docker, Kubernetes, and working in cloud-native environments
- Responsive web development and cross-platform application architecture experience
- Ability to estimate effort accurately and define clear milestones
- Prior experience in the Retail domain
- Excellent written and verbal communication skills

Optional/Nice-to-Have:
- Experience with Oracle Fusion Cloud migration, including data extraction, transformation, and enterprise system integration
- Basic knowledge of Finance and Accounting, specifically migrating enterprise systems such as chart of accounts, sub-ledgers, and financial reporting

Working Style & Tools
- Agile/Scrum methodologies
- DevOps pipelines and cloud-based deployment
- Collaborative team environment with frequent code reviews
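The role above calls for Test Driven Development (TDD). A minimal sketch of the red-green cycle, written in Python rather than the C#/.NET stack the posting targets; `apply_discount` and its pricing rule are hypothetical illustrations, not part of the posting:

```python
# TDD sketch: the test below was written first, then apply_discount
# was implemented to make it pass. Names and rules are illustrative.

def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by percent; reject out-of-range input."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_apply_discount():
    # Happy path
    assert apply_discount(200.0, 10) == 180.0
    # Boundary: zero discount leaves the price unchanged
    assert apply_discount(99.99, 0) == 99.99
    # Invalid input must be rejected, not silently computed
    try:
        apply_discount(100.0, 150)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for invalid percent")

test_apply_discount()
```

The same discipline applies in C# with xUnit or MSTest: the failing test defines the behaviour before any production code exists.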

Posted 1 month ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Data Engineer - Assistant Manager

Job Description
Job title: Data Engineer - Deputy Manager
Location: Chennai

Your role:
The Data Engineering manager needs to be well versed in the Microsoft Business Intelligence stack, with strong skills and experience in developing and implementing BI and advanced analytics solutions per business requirements.
- Strong hands-on experience in Microsoft ADF pipelines, Databricks notebooks, and PySpark
- Adept in the design and development of data flows using ADF
- Expertise in implementing complex ETL logic through Databricks notebooks
- Experience implementing CI/CD pipelines through Azure DevOps
- Experience writing complex T-SQL
- Understand business requirements and develop data models accordingly
- Knowledge and experience in prototyping, design, and requirements analysis
- Excellent knowledge of data usage, scheduling, data refresh, and diagnostics
- Experience with tools such as Microsoft Azure, SQL Data Warehouse, Visual Studio, etc.
- Worked in an agile (Scrum) environment with globally distributed teams
- Analytical bent of mind
- Business acumen and articulation skills; ability to capture business needs and translate them into a solution
- Ability to manage interaction with business stakeholders and others within the organization
- Good communication and documentation skills
- Proven experience interfacing with different source systems
- Proven experience in data modelling

Minimum required education: BE Computer Science / MCA / MSc IT
Minimum required experience: Minimum 2 years of experience in Data Engineering or equivalent with a Bachelor's degree
Preferred certification: Azure ADF/Databricks/T-SQL
Preferred skills: Azure ADF/Databricks, PySpark/T-SQL, Data Governance, Data Harmonization & Processing, Data Quality Assurance, Business Intelligence Tools, Requirements Analysis, Root Cause Analysis (RCA), Requirements Gathering

How We Work Together
We believe that we are better together than apart.
For our office-based teams, this means working in person at least 3 days per week. Onsite roles require full-time presence in the company's facilities. Field roles are most effectively done outside of the company's main facilities, generally at customers' or suppliers' locations.

About Philips
We are a health technology company. We built our entire company around the belief that every human matters, and we won't stop until everybody everywhere has access to the quality healthcare that we all deserve. Do the work of your life to help the lives of others. If you're interested in this role and have many, but not all, of the experiences needed, we encourage you to apply. You may still be the right candidate for this or other opportunities at Philips.
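The posting above emphasizes ADF pipelines and incremental data loading. A common pattern behind incremental copies in ADF is a high watermark: only rows modified since the last run are picked up, then the stored watermark advances. A plain-Python sketch of the idea, with the record shape and field names assumed for illustration:

```python
# High-watermark incremental load, as commonly orchestrated by an
# ADF pipeline: filter rows newer than the last stored watermark,
# then advance the watermark to the newest timestamp seen.
# Record shape and field names here are illustrative.

def incremental_load(source_rows, last_watermark):
    """Return (new_rows, new_watermark) for rows newer than last_watermark."""
    new_rows = [r for r in source_rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

rows = [
    {"id": 1, "modified": "2025-06-01"},
    {"id": 2, "modified": "2025-06-10"},
    {"id": 3, "modified": "2025-06-20"},
]
delta, wm = incremental_load(rows, "2025-06-05")
# delta contains ids 2 and 3; wm == "2025-06-20"
```

In a real pipeline the watermark is persisted (e.g. in a control table) between runs; ISO-8601 date strings compare correctly as plain strings, which keeps the sketch simple.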

Posted 1 month ago

Apply

4.0 - 10.0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Greetings! TCS India presents excellent opportunities for IT professionals.

Mega Walk-in Drive for Bhubaneswar Location
Role: Azure Data Engineer
Experience: 4-10 years
Location: Bhubaneswar
Required Technical Skill Set: Azure, ADF, ADB, Synapse & PySpark
Date: Saturday, 21st June 2025
Time: 9:30 AM to 11 AM

Posted 1 month ago

Apply


8.0 years

1 - 3 Lacs

Cochin

On-site

Introduction
Candidates should have 8+ years of experience in the IT industry, with strong .NET / .NET Core / Azure Cloud Services / Azure DevOps skills. This is a client-facing role, so strong communication skills are required. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud. Working hours are 8 hours, with 4 hours of overlap during the EST time zone (12 PM - 9 PM). This overlap is mandatory, as meetings happen during these hours.

Responsibilities include:
- Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery
- Integrate and support third-party APIs and external services
- Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack
- Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC)
- Participate in Agile/Scrum ceremonies and manage tasks using Jira
- Understand technical priorities, architectural dependencies, risks, and implementation challenges
- Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability

Certifications:
- Microsoft Certified: Azure Fundamentals
- Microsoft Certified: Azure Developer Associate
- Other relevant certifications in Azure, .NET, or cloud technologies

Primary Skills:
- 8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework / EF Core, JavaScript, jQuery, REST APIs
- Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
- Skilled in unit testing with xUnit and MSTest
- Strong in software design patterns, system architecture, and scalable solution design
- Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
- Strong problem-solving and debugging capabilities
- Ability to write reusable, testable, and efficient code
- Develop and maintain frameworks and shared libraries to support large-scale applications
- Excellent technical documentation, communication, and leadership skills
- Microservices and Service-Oriented Architecture (SOA)
- Experience in API integrations
- 2+ years of hands-on experience with Azure Cloud Services, including Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring

Secondary Skills:
- Familiarity with AngularJS, ReactJS, and other front-end frameworks
- Experience with Azure API Management (APIM)
- Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
- Experience with Azure Data Factory (ADF) and Logic Apps
- Exposure to application support and operational monitoring
- Azure DevOps CI/CD pipelines (Classic / YAML)

Posted 1 month ago

Apply

7.0 years

0 Lacs

Hyderābād

On-site

Digital Solutions Consultant I - HYD015Q
Company: Worley
Primary Location: IND-AP-Hyderabad
Job: Digital Solutions
Schedule: Full-time
Employment Type: Agency Contractor
Job Level: Experienced
Job Posting: Jun 16, 2025
Unposting Date: Jul 16, 2025
Reporting Manager Title: Senior General Manager

We deliver the world's most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role.

Building on our past. Ready for the future.
Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we're bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals, and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.

The Role
As a Digital Solutions Consultant with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. We are looking for a skilled Data Engineer to join our Digital Customer Solutions team. The ideal candidate should have experience in cloud computing and big data technologies. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data solutions that can handle large volumes of data. You will work closely with stakeholders to ensure that the data is accurate, reliable, and easily accessible.

Responsibilities:
- Design, build, and maintain scalable data pipelines that can handle large volumes of data.
- Document the design of proposed solutions, including structuring data (data modelling applying different techniques, including 3NF and dimensional modelling) and optimising data for further consumption (working closely with Data Visualization Engineers, Front-end Developers, Data Scientists and ML Engineers).
- Develop and maintain ETL processes to extract data from various sources (including sensor, semi-structured and unstructured data, as well as structured data stored in traditional databases or file stores, or from SOAP and REST data interfaces).
- Develop data integration patterns for batch and streaming processes, including implementation of incremental loads.
- Build quick prototypes and proofs-of-concept to validate assumptions and prove the value of proposed solutions or new cloud-based services.
- Define data engineering standards and develop data ingestion/integration frameworks.
- Participate in code reviews and ensure all solutions are aligned to architectural and requirement specifications.
- Develop and maintain cloud-based infrastructure to support data processing using Azure Data Services (ADF, ADLS, Synapse, Azure SQL DB, Cosmos DB).
- Develop and maintain automated data quality pipelines.
- Collaborate with cross-functional teams to identify opportunities for process improvement.
- Manage a team of Data Engineers.

About You
To be considered for this role, it is envisaged you will possess the following attributes:
- Bachelor's degree in Computer Science or a related field.
- 7+ years of experience with big data technologies such as Hadoop, Spark, Hive and Delta Lake.
- 7+ years of experience with cloud computing platforms such as Azure, AWS or GCP.
- Experience working in cloud data platforms, including a deep understanding of scaled data solutions.
- Experience working with different data integration patterns (batch and streaming), implementing incremental data loads.
- Proficient in scripting in Java, Windows and PowerShell.
- Proficient in at least one programming language, such as Python or Scala.
- Expert in SQL.
- Proficient in working with data services like ADLS, Azure SQL DB, Azure Synapse, Snowflake, NoSQL (e.g. Cosmos DB, MongoDB), Azure Data Factory, Databricks or similar on AWS/GCP.
- Experience using ETL tools (like Informatica IICS Data Integration) is an advantage.
- Strong understanding of Data Quality principles and experience implementing them.

Moving forward together
We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We're building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice here.

Please note: If you are being represented by a recruitment agency, you will not be considered; to be considered, you will need to apply directly to Worley.
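Among the responsibilities above is building automated data quality pipelines. A minimal rule-based sketch of such a check, run before a dataset is published; the rules and column names are illustrative assumptions, not Worley's actual implementation:

```python
# Rule-based data quality checks of the kind a quality pipeline runs
# before publishing a dataset. Rules and column names are illustrative.

def run_quality_checks(rows):
    """Return a list of (rule_name, failing_row_index) violations."""
    violations = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Rule 1: ids must be unique across the batch
        if row.get("id") in seen_ids:
            violations.append(("unique_id", i))
        seen_ids.add(row.get("id"))
        # Rule 2: amount must be present; Rule 3: amount must be >= 0
        if row.get("amount") is None:
            violations.append(("amount_not_null", i))
        elif row["amount"] < 0:
            violations.append(("amount_non_negative", i))
    return violations

rows = [
    {"id": 1, "amount": 10.5},
    {"id": 1, "amount": None},   # duplicate id AND null amount
    {"id": 2, "amount": -3.0},   # negative amount
]
issues = run_quality_checks(rows)
```

In a production pipeline the same rules would be expressed in a framework (e.g. Databricks expectations or a dedicated DQ tool) and the violations routed to monitoring rather than returned in-process.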

Posted 1 month ago

Apply


3.0 years

0 Lacs

Calcutta

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Senior Associate

Job Description & Summary
A career in our Microsoft Dynamics team will provide the opportunity to help our clients transform their technology landscape across Front, Back and Mid-Office functions leveraging Microsoft Dynamics. We focus on contributing to PwC's value proposition of "strategy led and technology enabled", by aligning our Consulting Solutions' industry focus with Microsoft technologies such as Dynamics 365, Azure, Power Platform and Power BI.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
" Responsibilities: · Experience as a Data Analyst with high proficiency; · Expertise in writing and optimizing SQL Queries in SQL Server and/or Oracle; · Experience in Data Extraction, Transformation and Loading (ETL) using SSIS and ADF; · Experience in PowerBI and/or Tableau for visualizing and analyzing data; · Having knowledge in Database Normalization for optimum performance; · Excellency in MS Excel with proficiency in Vlookups, Pivot Tables and VBA Macros; · Knowledge about data warehousing concepts · Performance optimization and troubleshooting capabilities · Good Project Management Skills- Client Meetings, Stakeholder Engagement · Familiarity with Agile Methodology · Strong knowledge in Azure DevOps Boards, Sprint, Queries, Pipelines (CI/ CD) etc. Mandatory skill sets: ADF, Power BI Preferred skill sets: Devops/CI/CD Years of experience required: 3-7 years Education qualification: B.Tech/B.E Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Power BI Optional Skills Acceptance Test Driven Development (ATDD), Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, Android, API Management, Appian (Platform), Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C#.NET, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Creativity {+ 46 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 1 month ago

Apply

5.0 - 12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Greetings from Kellton Tech!

Job Title: Java & ADF Developer / Java with Spring Boot
Location: Hyderabad (Onsite - Client Location)
Experience: 5-12 years
Employment Type: Full-time / Contract (as applicable)
Joining: Immediate to 30 days preferred

About Kellton:
We are a global IT services and digital product design and development company with subsidiaries that serve startup, mid-market, and enterprise clients across diverse industries, including Finance, Healthcare, Manufacturing, Retail, Government, and Nonprofits. At Kellton, we believe that our people are our greatest asset. We are committed to fostering a culture of collaboration, innovation, and continuous learning. Our core values include integrity, customer focus, teamwork, and excellence. To learn more about our organization, please visit us at www.kellton.com

Are you craving a dynamic and autonomous work environment? If so, this opportunity may be just what you're looking for. We value your critical thinking skills and encourage your input and creative ideas. To boost your productivity, we provide a comprehensive suite of IT tools and practices, backed by an experienced team.
Req 1: Java with Spring Boot

Technical Skills:
- Java (should also be able to work on older versions 7 & 8)
- Spring Boot, Spring JPA, Spring Security
- MySQL
- IDEs: primarily NetBeans, also Eclipse
- Jasper Reports
- Application servers: Tomcat, JBoss (WildFly)
- Basic knowledge of Linux

Day-to-Day Responsibilities:
- Handling API-related issues and bug fixes
- Developing new APIs and features per business requirements
- Coordinating and deploying builds in UAT environments
- Collaborating with the QA and product teams to ensure smooth releases

Additional skill set: Java, Spring Boot, Hibernate, JUnit, JWT, OAuth, Redis, Docker, Kafka (optional), OpenAPI standards, Jenkins/Git pipelines, etc.

Req 2: Java & Oracle ADF Developer

About the Role:
We are looking for a skilled Java and Oracle ADF Developer to join our team for an on-site deployment at our client's location in Hyderabad. The ideal candidate should have a solid background in Java development, Oracle ADF, and associated tools and technologies, strong problem-solving abilities, and experience working in a Linux-based environment.

Key Responsibilities:
- Develop and maintain enterprise-grade applications using Oracle ADF and Java 7/8.
- Design and implement reports using Jasper Reports and iReport.
- Manage deployments and configurations on the JBoss application server.
- Work with development tools such as NetBeans, Eclipse, or JDeveloper.
- Perform data management tasks using MySQL.
- Write and maintain shell scripts and configure cron jobs for scheduled tasks.
- Administer and monitor systems in a Linux environment.
- Utilize Apache Superset for data visualization and dashboard reporting.
- Collaborate with cross-functional teams to deliver high-quality solutions on time.
- Troubleshoot issues and provide timely resolutions.
Required Skills:
- Proficiency in Java 7/8 and object-oriented programming
- Strong hands-on experience with Oracle ADF
- Expertise in Jasper Reports, iReport, and report generation
- Experience with JBoss server setup and application deployment
- Familiarity with NetBeans, Eclipse, or JDeveloper IDEs
- Good understanding of MySQL database design and queries
- Experience with Linux OS and shell scripting
- Ability to set up and manage cron jobs
- Knowledge of Apache Superset or similar BI tools
- Strong problem-solving and debugging skills

Good to Have:
- Exposure to Agile development practices
- Familiarity with REST APIs and web services
- Knowledge of version control tools (e.g., Git)

Education:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

What we offer you:
- Existing clients in multiple domains to work with
- A strong and efficient team committed to quality output
- Opportunities to enhance your knowledge and gain industry domain expertise by working in varied roles
- A team of experienced, fun, and collaborative colleagues
- Hybrid work arrangement for flexibility and work-life balance (if the client/project allows)
- Competitive base salary and job satisfaction

Join our team and become part of an exciting company where your expertise and ideas are valued, and where you can make a significant impact in the IT industry. Apply today! Interested applicants, please submit your detailed resume stating your current and expected compensation and notice period to srahaman@kellton.com

Posted 1 month ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Title: Azure Data Engineer
Experience: 5-8 years
Location: Pune and Gurgaon (Hybrid)

Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering roles using Azure and Snowflake.
- Strong problem-solving, communication, and collaboration skills.
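The ELT style this posting describes — ADF for orchestration, DBT-style SQL transformations run inside the warehouse — can be sketched with `sqlite3` standing in for Snowflake. Table and column names below are illustrative, not from the posting:

```python
# ELT transform in the style the posting describes: raw data is loaded
# into a staging table first, then reshaped with SQL inside the
# warehouse (sqlite3 stands in for Snowflake; names are illustrative).
import sqlite3

con = sqlite3.connect(":memory:")

# "Load": raw rows land in a staging table untransformed
con.execute("CREATE TABLE stg_orders (id INTEGER, status TEXT, amount REAL)")
con.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?)",
    [(1, "complete", 20.0), (2, "cancelled", 15.0), (3, "complete", 30.0)],
)

# "Transform": a DBT-style model, i.e. a derived table built purely
# with SQL inside the warehouse
con.execute("""
    CREATE TABLE fct_revenue AS
    SELECT status, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM stg_orders
    GROUP BY status
""")
rows = con.execute("SELECT * FROM fct_revenue ORDER BY status").fetchall()
# rows == [('cancelled', 1, 15.0), ('complete', 2, 50.0)]
```

In an actual project, ADF would schedule the load into staging and DBT would materialize `fct_revenue` as a versioned model with tests and documentation.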

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Responsibilities
- Create and manage scalable data pipelines to collect, process, and store large volumes of data from various sources
- Integrate data from multiple sources, ensuring consistency, quality, and reliability
- Design, implement, and optimize database schemas and structures to support data storage and retrieval
- Develop and maintain ETL (Extract, Transform, Load) processes to accurately and efficiently move data between systems
- Build and maintain data warehouses to support business intelligence and analytics needs
- Optimize data processing and storage performance for efficient resource utilization and quick retrieval
- Create and maintain comprehensive documentation for data pipelines, ETL processes, and database schemas
- Monitor data pipelines and systems for performance and reliability, troubleshooting and resolving issues as they arise
- Stay up to date with emerging technologies and best practices in data engineering, evaluating and recommending new tools as appropriate

Requirements
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field (Engineering or Math preferred)
- 5+ years of experience with SQL, Python, .NET, SSIS, and SSAS
- 2+ years of experience with Azure cloud services, particularly SQL Server, ADF, Azure Databricks, ADLS, Key Vault, Azure Functions, and Logic Apps, with an emphasis on Databricks
- 2+ years of experience using Git and deploying code using a CI/CD approach
- Strong analytical and problem-solving skills
- Excellent communication and interpersonal skills
- Ability to work independently and as part of a team
- Attention to detail and a commitment to quality
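The ETL responsibilities above can be sketched as three small, independently testable stages, which is also what makes such pipelines easy to monitor and troubleshoot. Source data and field names are illustrative:

```python
# A minimal extract-transform-load pipeline of the kind described
# above, composed of small stages so each one can be monitored and
# tested in isolation. Data and field names are illustrative.

def extract():
    # Stand-in for reading from a source system (API, file, database)
    return [{"name": " Alice ", "score": "91"}, {"name": "bob", "score": "78"}]

def transform(rows):
    # Clean strings and cast types so downstream consumers get
    # consistent, typed records
    return [{"name": r["name"].strip().title(), "score": int(r["score"])}
            for r in rows]

def load(rows, target):
    # Stand-in for writing to a warehouse table; returns row count
    # so the pipeline run can be logged and monitored
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

Keeping each stage a pure function over its input is what lets a real orchestrator (SSIS, ADF, Databricks jobs) retry or backfill a single stage without re-running the whole pipeline.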

Posted 1 month ago

Apply

7.0 years

0 Lacs

Mohali district, India

On-site

Job Summary: We are seeking a skilled Database & ETL Developer with strong expertise in SQL, data modeling, cloud integration, and reporting tools. The ideal candidate will be responsible for designing scalable database architectures, optimizing complex queries, working with modern ETL tools, and delivering insightful dashboards and reports. This role requires collaboration across multiple teams and the ability to manage tasks in a dynamic, agile environment.
Experience: 7+ Years
Location: Mohali (Work from Office Only)

Key Responsibilities:
Design and implement normalized and scalable database schemas using ER modeling techniques.
Develop, maintain, and optimize stored procedures, triggers, views, and advanced SQL queries (e.g., joins, subqueries, indexing).
Execute database backup, restore, and recovery operations, ensuring data integrity and high availability.
Optimize SQL performance through indexing strategies, execution plans, and query refactoring.
Lead and support cloud migration and integration projects involving platforms such as AWS and Azure.
Implement and manage data lake architectures such as AWS HealthLake and AWS Glue pipelines.
Create and manage interactive dashboards and business reports using Power BI, Amazon QuickSight, or Tableau.
Collaborate with cross-functional teams and use tools like JIRA, Azure Boards, ClickUp, or Trello for task and project tracking.

Required Skills:
Strong experience with SQL Server or PostgreSQL, including advanced T-SQL programming.
In-depth knowledge of ER modeling and relational database design principles.
Proficiency in query optimization, indexing, joins, and subqueries.
Hands-on experience with modern ETL and data integration platforms such as Airbyte, Apache Airflow, Azure Data Factory (ADF), and AWS Glue.
Understanding of Data Lake / HealthLake architectures and their role in cloud data ecosystems.
Proficiency with reporting tools like Power BI, Amazon QuickSight, or Tableau.
Experience with database backup, restore, and high-availability strategies.
Familiarity with project/task tracking tools such as JIRA, Azure Boards, ClickUp, or Trello.

Soft Skills:
Strong verbal and written communication skills.
Excellent problem-solving and troubleshooting abilities.
Self-motivated, with the ability to manage priorities and work independently across multiple projects.

Nice to Have:
Certification in cloud platforms (AWS, Azure).
Exposure to healthcare data standards and compliance (e.g., HIPAA, FHIR).

Company overview: smartData is a leader in the global software business space for business consulting and technology integration, making business easier, accessible, secure, and meaningful for its target segment of startups to small & medium enterprises. As your technology partner, we provide both domain and technology consulting, and our in-house products and unique productized service approach help us act as business integrators, saving substantial time to market for our esteemed customers. With 8000+ projects and vast experience of 20+ years, backed by offices in the US, Australia, and India providing next-door assistance and round-the-clock connectivity, we ensure continual business growth for all our customers. Our business consulting and integrator services via software solutions focus on the important industries of healthcare, B2B, B2C, & B2B2C platforms, online delivery services, video platform services, and IT services. Strong expertise in Microsoft, LAMP stack, and MEAN/MERN stack with a mobility-first approach via native (iOS, Android, Tizen) or hybrid (React Native, Flutter, Ionic, Cordova, PhoneGap) mobility stacks, mixed with AI & ML, helps us deliver on the ongoing needs of customers continuously. For more information, visit http://www.smartdatainc.com
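The query-optimization duty above (indexing strategies and execution plans) can be demonstrated with stdlib SQLite: the same lookup goes from a full table scan to an index seek once an index exists. Real work would target SQL Server or PostgreSQL, whose plan tooling differs, but the principle is identical.

```python
import sqlite3

# Demonstrates how an index changes an execution plan, using SQLite's
# EXPLAIN QUERY PLAN. Illustrative; production work uses SQL Server/Postgres.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [(f"cust{i % 100}", float(i)) for i in range(1000)],
)

def plan(sql: str) -> str:
    """Return SQLite's query-plan detail text for a statement."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

before = plan("SELECT total FROM orders WHERE customer = 'cust7'")
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
after = plan("SELECT total FROM orders WHERE customer = 'cust7'")
```

Before the index the plan reports a scan of `orders`; after, a search using `idx_orders_customer`, which is exactly the shift a DBA looks for when refactoring slow queries.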

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position: Data Engineer – Azure Synapse
Location: Mumbai/Pune/Nagpur
Type of Employment: TPC

Key Result Areas and Activities:
Design, develop, and deploy ETL/ELT solutions on premise or in the cloud.
Transformation of data with stored procedures.
Report development (MicroStrategy/Power BI).
Create and maintain comprehensive documentation for data pipelines, configurations, and processes.
Ensure data quality and integrity through effective data management practices.
Monitor and optimize data pipeline performance.
Troubleshoot and resolve data-related issues.

Technical Experience:
Must Have
Good experience in Azure Synapse.
Good experience in ADF.
Good experience in Snowflake and stored procedures.
Experience with ETL/ELT processes, data warehousing, and data modelling.
Experience with data quality frameworks, monitoring tools, and job scheduling.
Knowledge of data formats like JSON, XML, CSV, and Parquet.
Fluent English (strong written, verbal, and presentation skills).
Agile methodology and tools like JIRA.
Good communication and formal skills.

Good To Have
Good experience in MicroStrategy and Power BI.
Experience in scripting languages such as Python, Java, or shell scripting.
Familiarity with Azure cloud platforms and cloud data services.

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Data Governance & MDM Specialist – Azure Purview and Profisee (POC Implementation)

Job Overview: We are seeking a skilled Data Governance and Master Data Management (MDM) Specialist to lead the setup and validation of a POC environment using Azure Purview and Profisee. The goal is to establish an integrated framework that ensures high standards in data quality, lineage tracking, security, and master data management across source systems.

Key Responsibilities:
Set up and configure Azure Purview based on the defined architecture.
Deploy and configure the Profisee MDM platform.
Provision necessary Azure resources, including storage, compute, and access controls.
Configure the data glossary, data classifications, domains and metadata schemas, and security and access policies.
Set up data lineage tracking across integrated systems.
Define and implement match/merge rules, workflow processes, and data quality logic in Profisee.
Integrate Azure Purview and Profisee with multiple source systems for the POC.
Build data ingestion and transformation pipelines using tools such as Azure Data Factory (ADF).
Ensure accurate lineage visualization, data quality validation, and matching logic verification.
Provide support for orchestration, testing, and ongoing validation during the POC.

Required Skills & Experience:
Hands-on experience with Azure Purview configuration and integration.
Strong expertise in Profisee or similar MDM platforms.
Experience with data cataloging, metadata management, and data lineage.
Familiarity with data governance frameworks and data stewardship.
Proficient in Azure Data Factory (ADF) or other data pipeline tools.
Good understanding of Azure cloud architecture, RBAC, and resource provisioning.
Strong SQL skills for data profiling and validation.
Experience with match/merge logic, workflow configuration, and business rules in MDM tools.
Ability to troubleshoot and resolve integration issues across systems.

Nice to Have:
Familiarity with Databricks, Azure Functions, or Logic Apps.
Knowledge of Infrastructure as Code (IaC) – ARM, Bicep, or Terraform.
Prior experience with API-based integrations for real-time data sync.
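The match/merge responsibility described above can be sketched in plain Python: match source records on a normalized key, then merge duplicates with a "most recently updated wins" survivorship rule. Profisee configures this declaratively rather than in code, and the field names below are hypothetical.

```python
# Sketch of MDM match/merge: normalized match key plus a simple
# "latest non-empty value survives" survivorship rule. Field names
# are hypothetical; real MDM tools configure this declaratively.

def match_key(rec: dict) -> tuple:
    """Normalize the matching fields so near-duplicates collide."""
    return (rec["name"].strip().lower(), rec["email"].strip().lower())

def merge(records: list) -> list:
    """Collapse duplicate source records into one golden record per key."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        key = match_key(rec)
        if key in golden:
            # Newer record's non-empty fields overwrite older values.
            golden[key].update({k: v for k, v in rec.items() if v})
        else:
            golden[key] = dict(rec)
    return list(golden.values())
```

Real match rules also score fuzzy similarity (name distance, address standardization) rather than requiring exact normalized equality.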

Posted 1 month ago

Apply

0 years

0 Lacs

Ganganagar, Rajasthan, India

On-site

33471BR
Pune

Job Description
Mandatory Skills:
1. Core Java 8
2. Spring Core, Spring Boot, Spring MVC, Spring REST
3. JPA, Spring JDBC template, or Hibernate
4. REST web services
5. Security (Basic, JWT, OAuth, API Key)
6. Angular
7. Oracle Database, SQL queries
8. Jenkins
9. Agile

Good-to-have skills:
1. Spring Batch
2. JSP
3. Oracle ADF
4. PostgreSQL
5. AWS Developer basics
6. SQL tuning, PL/SQL

Additional Requirements:
1. AWS cloud developer skills know-how
2. Troubleshooting production and performance issues

Qualifications: Engineering
Range of Experience: 9–14 years

Posted 1 month ago

Apply

0 years

0 Lacs

Ganganagar, Rajasthan, India

On-site

33472BR
Pune

Job Description
Mandatory Skills:
1. Core Java 8
2. Spring Core, Spring Boot, Spring MVC, Spring REST
3. JPA, Spring JDBC template, or Hibernate
4. REST web services
5. Security (Basic, JWT, OAuth, API Key)
6. Angular
7. Oracle Database, SQL queries
8. Jenkins
9. Agile

Good-to-have skills:
1. Spring Batch
2. JSP
3. Oracle ADF
4. PostgreSQL
5. AWS Developer basics
6. SQL tuning, PL/SQL

Additional Requirements:
1. AWS cloud developer skills know-how
2. Troubleshooting production and performance issues

Qualifications: Engineering
Range of Experience: 9–14 years

Posted 1 month ago

Apply

6.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Designation: Sr. Consultant
Experience: 6 to 7 years
Location: Bengaluru
Skills Required: Python, SQL, Databricks, ADF; within Databricks – DLT, PySpark, Structured Streaming, performance and cost optimization.

Roles and Responsibilities:
Capture business problems, value drivers, and functional/non-functional requirements and translate them into functionality.
Assess the risks, feasibility, opportunities, and business impact.
Assess and model processes, data flows, and technology to understand the current value and issues, and identify opportunities for improvement.
Create/update clear documentation of requirements to align with the solution over the project lifecycle.
Ensure traceability of requirements from business needs through testing and scope changes, to the final solution.
Interact with software suppliers, designers, and developers to understand software limitations, deliver elements of system and database design, and ensure that business requirements and use cases are handled.
Configure and document software and processes, using agreed standards and tools.
Create acceptance criteria and validate that solutions meet business needs, through defining and coordinating testing.
Create and present compelling business cases to justify solution value and establish approval, funding, and prioritization.
Initiate, plan, execute, monitor, and control Business Analysis activities on projects within agreed parameters of cost, time, and quality.
Lead stakeholder management activities and large design sessions.
Lead teams to complete business analysis on projects.
Configure and document software and processes.
Define and coordinate testing.

Mandatory skills:
Agile project experience; understanding of Agile frameworks and tools; hands-on work in Agile.
Experience educating stakeholders, including Product Owners and business partners, in Agile ways of working.
Understanding of systems engineering concepts, data/process analysis and modeling, and products & solutions.
Degree; 4–7 years in IT.

Optional skills:
Agile certifications/trainings preferred.
CBAP (Certified Business Analysis Professional) or PMI-PBA certification preferred.
Lean Practitioner training and experience are an asset.

Posted 1 month ago

Apply

7.0 years

0 Lacs

India

Remote

Job Title: Senior Data Engineer – Azure & API Development
Location: Remote
Experience Required: 7+ Years

Job Summary: We are looking for an experienced Data Engineer with strong expertise in Azure cloud architecture, API development, and modern data engineering tools. The ideal candidate will have in-depth experience in building and maintaining scalable data pipelines and API integrations using Azure services like Azure Data Factory (ADF), Databricks, Azure Functions, and Service Bus, along with infrastructure provisioning using Terraform.

Key Responsibilities:
Design and implement scalable, secure, and high-performance data solutions on Azure.
Develop, deploy, and manage RESTful APIs to support data access and integration.
Build and maintain ETL/ELT data pipelines using Azure Data Factory, Databricks, and Azure Functions.
Integrate data workflows with Azure Service Bus and other messaging services.
Define and implement cloud infrastructure using Terraform and Infrastructure-as-Code (IaC) best practices.
Collaborate with stakeholders to understand data requirements and develop technical solutions.
Ensure best practices for data governance, security, monitoring, and performance optimization.
Work closely with DevOps and Data Architects to implement CI/CD pipelines and production-grade deployments.

Must-Have Skills:
7+ years of professional experience in Data Engineering or related roles.
Strong hands-on experience with Azure services, particularly Azure Data Factory (ADF), Databricks (Spark-based processing), Azure Functions, and Azure Service Bus.
Proficient in API development (RESTful APIs using Python, .NET, or Node.js).
Good command of SQL, Spark SQL, and data transformation techniques.
Experience with Terraform for IaC and provisioning Azure resources.
Excellent understanding of data architecture, cloud security, and governance models.
Strong problem-solving skills and experience working in Agile environments.

Preferred Skills:
Familiarity with CI/CD tools like Azure DevOps, GitHub Actions, or Jenkins.
Exposure to event-driven architecture and real-time data streaming.
Knowledge of containerization (Docker/Kubernetes) is a plus.
Experience in performance tuning and cost optimization in Azure environments.

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Years of Experience: 8+
Mode of Work: Remote

Design, develop, modify, and test software applications for the healthcare industry in an agile environment. Duties include:
Develop, support/maintain, and deploy software to support a variety of business needs.
Provide technical leadership in the design, development, testing, deployment, and maintenance of software solutions.
Design and implement platform and application security for applications.
Perform advanced query analysis and performance troubleshooting.
Coordinate with senior-level stakeholders to ensure the development of innovative software solutions to complex technical and creative issues.
Re-design software applications to improve maintenance cost, testing functionality, platform independence, and performance.
Manage user stories and project commitments in an agile framework to rapidly deliver value to customers.
Deploy and operate software solutions using a DevOps model.

Required skills: Azure Delta Lake, ADF, Databricks, PySpark, Oozie, Airflow, Big Data technologies (HBase, Hive), CI/CD (GitHub/Jenkins)

Posted 1 month ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The role involves building and managing data pipelines, troubleshooting issues, and ensuring data accuracy across various platforms such as Azure Synapse Analytics, Azure Data Lake Gen2, and SQL environments. This position requires extensive SQL experience and a strong background in PySpark development. Responsibilities Data Engineering: Work with Azure Synapse Pipelines and PySpark for data transformation and pipeline management. Perform data integration and schema updates in Delta Lake environments, ensuring smooth data flow and accurate reporting. Work with our Azure DevOps team on CI/CD processes for deployment of Infrastructure as Code (IaC) and Workspace artifacts. Develop custom solutions for our customers defined by our Data Architect and assist in improving our data solution patterns over time. Documentation : Document ticket resolutions, testing protocols, and data validation processes. Collaborate with other stakeholders to provide specifications and quotations for enhancements requested by customers. Ticket Management: Monitor the Jira ticket queue and respond to tickets as they are raised. Understand ticket issues, utilizing extensive SQL, Synapse Analytics, and other tools to troubleshoot them. Communicate effectively with customer users who raised the tickets and collaborate with other teams (e.g., FinOps, Databricks) as needed to resolve issues. Troubleshooting and Support: Handle issues related to ETL pipeline failures, Delta Lake processing, or data inconsistencies in Synapse Analytics. Provide prompt resolution to data pipeline and validation issues, ensuring data integrity and performance. Desired Skills & Requirements Seeking a candidate with 5+ years of Dynamics 365 ecosystem experience with a strong PySpark development background. While various profiles may apply, we highly value a strong person-organization fit. 
Our ideal candidate possesses the following attributes and qualifications:
Extensive experience with SQL, including query writing and troubleshooting in Azure SQL, Synapse Analytics, and Delta Lake environments.
Strong understanding and experience in implementing and supporting ETL processes, Data Lakes, and data engineering solutions.
Proficiency in using Azure Synapse Analytics, including workspace management, pipeline creation, and data flow management.
Hands-on experience with PySpark for data processing and automation.
Ability to use VPNs, MFA, RDP, jump boxes/jump hosts, etc., to operate within customers' secure environments.
Some experience with Azure DevOps CI/CD, IaC, and release pipelines.
Ability to communicate effectively both verbally and in writing, with strong problem-solving and analytical skills.
Understanding of the operation and underlying data structure of D365 Finance and Operations, Business Central, and Customer Engagement.
Experience with Data Engineering in Microsoft Fabric.
Experience with Delta Lake and Azure data engineering concepts (e.g., ADLS, ADF, Synapse, AAD, Databricks).
Certifications in Azure Data Engineering.

Why Join Us?
Work with innovative technologies in a dynamic environment: a progressive work culture with a global perspective where your ideas truly matter and growth opportunities are endless.
Work with the latest Microsoft technologies alongside Dynamics professionals committed to driving customer success.
Enjoy the flexibility to work from anywhere, with work-life balance that suits your lifestyle.
Competitive salary and comprehensive benefits package.
Career growth and professional development opportunities.
A collaborative and inclusive work culture.
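The troubleshooting duty above ("data inconsistencies in Synapse Analytics") usually starts with reconciling source against target. The sketch below compares row counts and an order-insensitive content hash in plain Python; it stands in for checks a support engineer would run against Synapse or Delta Lake tables, and the function names are illustrative.

```python
import hashlib
import json

# Source-vs-target reconciliation: row counts plus an order-insensitive
# content fingerprint. Illustrative stand-in for warehouse-side checks.

def table_fingerprint(rows):
    """Return (row_count, hash) insensitive to row and key order."""
    digests = sorted(
        hashlib.sha256(json.dumps(r, sort_keys=True).encode()).hexdigest()
        for r in rows
    )
    return len(rows), hashlib.sha256("".join(digests).encode()).hexdigest()

def reconcile(source_rows, target_rows) -> str:
    src_count, src_hash = table_fingerprint(source_rows)
    tgt_count, tgt_hash = table_fingerprint(target_rows)
    if src_count != tgt_count:
        return f"row count mismatch: {src_count} vs {tgt_count}"
    if src_hash != tgt_hash:
        return "row contents differ"
    return "ok"
```

A count mismatch points at a pipeline failure or filter; matching counts with differing hashes points at a transformation defect.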

Posted 1 month ago

Apply

4.0 years

0 Lacs

India

On-site

Job Title: Azure Databricks Engineer
Experience: 4+ Years

Required Skills:
4+ years of experience in Data Engineering.
Strong hands-on experience with Azure Databricks and PySpark.
Good understanding of Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), and Azure Synapse.
Strong SQL skills and experience with large-scale data processing.
Experience with version control systems (Git), CI/CD pipelines, and Agile methodology.
Knowledge of Delta Lake, Lakehouse architecture, and distributed computing concepts.

Preferred Skills:
Experience with Airflow, Power BI, or machine learning pipelines.
Familiarity with DevOps tools for automation and deployment in Azure.
Azure certifications (e.g., DP-203) are a plus.
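A core pattern behind the ADF/Databricks skills listed above is the incremental (watermark-based) load: process only rows newer than the last stored watermark, then advance it. The sketch below shows the logic in plain Python; in PySpark the same filter would be expressed on a DataFrame, and the column name `modified` is an assumption.

```python
from datetime import datetime

# Watermark-based incremental load, sketched in plain Python.
# The "modified" column name is illustrative.

def incremental_load(rows, watermark: datetime):
    """Return (new_rows, advanced_watermark)."""
    new_rows = [r for r in rows if r["modified"] > watermark]
    if new_rows:
        watermark = max(r["modified"] for r in new_rows)
    return new_rows, watermark
```

ADF's built-in incremental copy templates persist the watermark between runs (for example, in a control table) so each pipeline execution resumes where the last one stopped.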

Posted 1 month ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction
Candidates should have 8+ years of experience in the IT industry, with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud. Working hours: 8 hours, with 4 hours of overlap with the EST time zone (12 PM – 9 PM). This overlap is mandatory, as meetings happen during these hours.

Responsibilities include:
Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery.
Integrate and support third-party APIs and external services.
Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack.
Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC).
Participate in Agile/Scrum ceremonies and manage tasks using Jira.
Understand technical priorities, architectural dependencies, risks, and implementation challenges.
Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability.

Certifications:
Microsoft Certified: Azure Fundamentals
Microsoft Certified: Azure Developer Associate
Other relevant certifications in Azure, .NET, or cloud technologies

Primary Skills:
8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework/EF Core, JavaScript, jQuery, and REST APIs.
Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types.
Skilled in unit testing with xUnit and MSTest.
Strong in software design patterns, system architecture, and scalable solution design.
Ability to lead and inspire teams through clear communication, technical mentorship, and ownership.
Strong problem-solving and debugging capabilities.
Ability to write reusable, testable, and efficient code.
Develop and maintain frameworks and shared libraries to support large-scale applications.
Excellent technical documentation, communication, and leadership skills.
Microservices and Service-Oriented Architecture (SOA).
Experience in API integrations.
2+ years of hands-on experience with Azure Cloud Services, including Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring.

Secondary Skills:
Familiarity with AngularJS, ReactJS, and other front-end frameworks.
Experience with Azure API Management (APIM).
Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes).
Experience with Azure Data Factory (ADF) and Logic Apps.
Exposure to application support and operational monitoring.
Azure DevOps – CI/CD pipelines (Classic/YAML).

Posted 1 month ago

Apply

8.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site

Introduction
Candidates should have 8+ years of experience in the IT industry, with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud. Working hours: 8 hours, with 4 hours of overlap with the EST time zone (12 PM – 9 PM). This overlap is mandatory, as meetings happen during these hours.

Responsibilities include:
Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery.
Integrate and support third-party APIs and external services.
Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack.
Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC).
Participate in Agile/Scrum ceremonies and manage tasks using Jira.
Understand technical priorities, architectural dependencies, risks, and implementation challenges.
Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability.

Certifications:
Microsoft Certified: Azure Fundamentals
Microsoft Certified: Azure Developer Associate
Other relevant certifications in Azure, .NET, or cloud technologies

Primary Skills:
8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework/EF Core, JavaScript, jQuery, and REST APIs.
Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types.
Skilled in unit testing with xUnit and MSTest.
Strong in software design patterns, system architecture, and scalable solution design.
Ability to lead and inspire teams through clear communication, technical mentorship, and ownership.
Strong problem-solving and debugging capabilities.
Ability to write reusable, testable, and efficient code.
Develop and maintain frameworks and shared libraries to support large-scale applications.
Excellent technical documentation, communication, and leadership skills.
Microservices and Service-Oriented Architecture (SOA).
Experience in API integrations.
2+ years of hands-on experience with Azure Cloud Services, including Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring.

Secondary Skills:
Familiarity with AngularJS, ReactJS, and other front-end frameworks.
Experience with Azure API Management (APIM).
Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes).
Experience with Azure Data Factory (ADF) and Logic Apps.
Exposure to application support and operational monitoring.
Azure DevOps – CI/CD pipelines (Classic/YAML).

Posted 1 month ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra

On-site

Our software engineers at Fiserv bring an open and creative mindset to a global team developing mobile applications, user interfaces and much more to deliver industry-leading financial services technologies to our clients. Our talented technology team members solve challenging problems quickly and with quality. We're seeking individuals who can create frameworks, leverage developer tools, and mentor and guide other members of the team. Collaboration is key and whether you are an expert in a legacy software system or are fluent in a variety of coding languages you're sure to find an opportunity as a software engineer that will challenge you to perform exceptionally and deliver excellence for our clients. Full-time Entry, Mid, Senior Yes (occasional), Minimal (if any) Responsibilities Requisition ID R-10358155 Date posted 06/16/2025 End Date 06/24/2025 City Pune State/Region Maharashtra Country India Location Type Onsite Calling all innovators – find your future at Fiserv. We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv. Job Title Specialist, Software Development Engineering What does a great Software Development Engineer do? As Software Development Engineer your focus will be on applying the principles of engineering to software development. The role focuses on the complex and large software systems that make up the core systems for the organization. You will be responsible for developing, unit testing, and integration tasks working within this highly visible-client focused web services application. 
Development efforts will also include feature enhancements, client implementations, and bug fixes, as well as support of the production environment.

What you will do:
Collaborate within a team environment in the development, testing, and support of software development project lifecycles.
Develop web interfaces and underlying business logic.
Prepare any necessary technical documentation.
Track and report daily and weekly activities.
Participate in code reviews and code remediation.
Perform and develop proper unit tests and automation.
Participate in a 24-hour on-call rotation to support previous releases of the product.
Research problems discovered by QA or product support and develop solutions to the problems.
Perform additional duties as determined by business needs and as directed by management.

What you will need to have:
Bachelor's degree in Computer Science, Engineering, or Information Technology, or equivalent experience.
5+ years of experience in developing scalable and secure J2EE applications.
Excellent knowledge of Java-based technologies (Core Java, JSP, AJAX, JSF, EJB, and the Spring Framework), Oracle SQL/PLSQL, and app servers like WebLogic and JBoss.
Excellent knowledge of SOAP and REST web service implementations.
Knowledge of the UNIX environment is preferred.
Experience with JSF UI component (Oracle ADF & RichFaces) technology is preferred.
Good analytical, organizational, and problem-solving abilities.
Good at prioritizing tasks and committed to completing them.
Strong team player / customer service orientation.
Demonstrated ability to work with both end users and technical staff.
Ability to track progress against assigned tasks, report status, and proactively identify issues.
Demonstrated ability to present information effectively in communications with peers and the project management team.
Highly organized; works well in a fast-paced, fluid, and dynamic environment.
What would be great to have: Experience working in a Scrum Development Team Banking and Financial Services experience Java Certifications. Thank you for considering employment with Fiserv. Please: Apply using your legal name Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable). Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law. Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions. Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Eviden, part of the Atos Group, with an annual revenue of circa €5 billion, is a global leader in data-driven, trusted, and sustainable digital transformation. As a next-generation digital business with worldwide leading positions in digital, cloud, data, advanced computing, and security, it brings deep expertise for all industries in more than 47 countries. By uniting unique high-end technologies across the full digital continuum with 47,000 world-class talents, Eviden expands the possibilities of data and technology, now and for generations to come.

Roles & Responsibilities
Design end-to-end data code development using PySpark, Python, SQL, and Kafka, leveraging Microsoft Fabric's capabilities.

Requirements
Hands-on experience with Microsoft Fabric, including Lakehouse, Data Factory, and Synapse.
Strong expertise in PySpark and Python for large-scale data processing and transformation.
Deep knowledge of Azure data services (ADLS Gen2, Azure Databricks, Synapse, ADF, Azure SQL, etc.).
Experience in designing, implementing, and optimizing end-to-end data pipelines on Azure.
Understanding of Azure infrastructure setup (networking, security, and access management) is good to have.
Healthcare domain knowledge is a plus but not mandatory.

Our Offering
Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
Wellbeing programs and work-life balance, with integration and passion-sharing events.
Attractive salary and company initiative benefits.
Courses and conferences.
Hybrid work culture.
Let's grow together.

Posted 1 month ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies