0.0 - 2.0 years
0 Lacs
Chandigarh, Chandigarh
Remote
Additional Locations: India-Bengaluru; India-Bengaluru-Remote; India-Chandigarh; India-Chandigarh-Remote; India-Hyderabad-Remote; India-Mohali; India-Mohali-Remote; India-Mumbai; India-Mumbai-Remote
Job ID: R0000030606 | Category: IT

ABOUT THIS ROLE

Key Accountabilities:
Using Microsoft Azure data PaaS services, design, build, modify, and support data pipelines leveraging Databricks and Power BI in a medallion architecture setting. Where necessary, create prototypes to validate proposed ideas and solicit input from stakeholders. Demonstrate an excellent grasp of, and expertise with, test-driven development and continuous integration processes.
Analysis and Design – Convert high-level designs into low-level designs and implement them. Collaborate with Team Leads to define and clarify business requirements, estimate development costs, and finalize work plans.
Testing – Create and run unit and integration tests on all created code throughout the development lifecycle. Benchmark application code proactively to prevent performance and scalability concerns. Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management.
Support and Troubleshooting – Assist the Operations Team with any environmental issues that arise during application deployment in the Development, QA, Staging, and Production environments. Assist other teams in resolving issues that may develop as a result of applications or the integration of multiple components.

Knowledge and Experience:
Understanding of design concepts and architectural basics. Knowledge of performance engineering. Understanding of quality processes and estimation methods. Fundamental grasp of the project domain. Ability to translate functional and non-functional needs into system requirements. Ability to design and code complex applications. Ability to create test cases and scenarios based on specifications. Solid knowledge of SDLC and agile techniques. Awareness of current technology and trends. Logical thinking and problem-solving abilities, as well as the capacity to collaborate.

Primary skills: Cloud Platform, Azure, Databricks, ADF, ADO.
Sought: Denodo, SQL, Python, Power BI, CDC tools (preferably Striim).
General knowledge: PowerApps, Java, Dataiku.
6-10 years of experience in software development, with a minimum of 2 years in cloud computing.

Education: Bachelor of Science in Computer Science, Engineering, or a related technical field.
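The medallion (bronze/silver/gold) pattern named above is easiest to see in code. Below is a minimal bronze-to-silver sketch, assuming a Databricks workspace with Delta Lake; the paths, table, and column names are hypothetical placeholders, not this employer's actual pipeline.

```python
# A minimal bronze-to-silver hop in a medallion architecture.
# Assumes Databricks with Delta Lake; paths/columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw ingested events, stored as-is.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/orders")

# Silver: cleaned and conformed - deduplicate, normalise types,
# and drop rows that fail a basic quality check.
silver = (
    bronze.dropDuplicates(["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .filter(F.col("amount").isNotNull())
)

silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/orders")
```

The same shape repeats for the silver-to-gold hop, typically adding business-level aggregation.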
Posted 1 month ago
5 - 8 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Azure Data Engineer – Senior

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity
We’re looking for candidates with strong technology and data understanding in the big data engineering space and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Develop and deploy big data pipelines in a cloud environment using Azure Cloud services.
Design, develop, and migrate existing on-prem ETL routines to cloud services.
Interact with senior leaders, understand their business goals, and contribute to the delivery of the workstreams.
Design and optimize model code for faster execution.

Skills and Attributes for Success
Overall 3+ years of IT experience, with 2+ years of relevant experience in Azure Data Factory (ADF) and good hands-on exposure to the latest ADF version (see the pipeline sketch after this posting).
Hands-on experience with Azure Functions and Azure Synapse (formerly SQL Data Warehouse).
Project experience with Azure Data Lake / Blob (for storage).
Basic understanding of Batch Account configuration and its various control options.
Sound knowledge of Databricks and Logic Apps.
Able to coordinate independently with business stakeholders, understand the business requirements, and implement them using ADF.

To qualify for the role, you must
Be a computer science graduate or equivalent with 3-7 years of industry experience.
Have working experience in an Agile-based delivery methodology (preferable).
Have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
Be an excellent communicator (written and verbal, formal and informal).
Participate in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.

Ideally, you’ll also have
Client management skills.

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working at EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.

Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around.
Opportunities to develop new skills and progress your career.
The freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
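Since ADF pipeline development is the core skill in this posting, here is a hedged sketch following the pattern of Microsoft's public azure-mgmt-datafactory quickstart (creating a blob-to-blob copy pipeline). All angle-bracketed names are placeholders, and exact model signatures vary by SDK version.

```python
# Sketch: define a copy pipeline with the azure-mgmt-datafactory SDK.
# Assumes the linked services and datasets (ds_input/ds_output) already
# exist in the factory; all names here are hypothetical.
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink, BlobSource, CopyActivity, DatasetReference, PipelineResource,
)

credential = ClientSecretCredential(
    tenant_id="<tenant-id>", client_id="<client-id>", client_secret="<secret>"
)
adf = DataFactoryManagementClient(credential, "<subscription-id>")

copy_activity = CopyActivity(
    name="CopyBlobToBlob",
    inputs=[DatasetReference(reference_name="ds_input", type="DatasetReference")],
    outputs=[DatasetReference(reference_name="ds_output", type="DatasetReference")],
    source=BlobSource(),
    sink=BlobSink(),
)
pipeline = PipelineResource(activities=[copy_activity])
adf.pipelines.create_or_update(
    "<resource-group>", "<factory-name>", "pl_copy_demo", pipeline
)
```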
Posted 1 month ago
5 - 8 years
0 Lacs
Trivandrum, Kerala, India
Role Description

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be adept at using ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding skills in Python, PySpark, and SQL. Works independently and demonstrates proficiency in at least one data-related domain, with a solid understanding of SCD concepts and data warehousing principles.

Outcomes
Collaborate closely with data analysts, data scientists, and other stakeholders to ensure data accessibility, quality, and security across various data sources.
Design, develop, and maintain data pipelines that collect, process, and transform large volumes of data from various sources.
Implement ETL (Extract, Transform, Load) processes to facilitate efficient data movement and transformation.
Integrate data from multiple sources, including databases, APIs, cloud services, and third-party data providers.
Establish data quality checks and validation procedures to ensure data accuracy, completeness, and consistency.
Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.

Measures of Outcomes
Adherence to engineering processes and standards.
Adherence to schedules/timelines.
Adherence to SLAs where applicable.
Number of defects post delivery.
Number of non-compliance issues.
Reduction in recurrence of known defects.
Quick turnaround of production bugs.
Completion of applicable technical/domain certifications.
Completion of all mandatory training requirements.
Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times).
Average time to detect, respond to, and resolve pipeline failures or data issues.

Outputs Expected
Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements.
Documentation: Create documentation for personal work and review deliverable documents, including source-target mappings, test cases, and results.
Configuration: Follow configuration processes diligently.
Testing: Create and conduct unit tests for data pipelines and transformations to ensure data quality and correctness. Validate the accuracy and performance of data processes.
Domain Relevance: Develop features and components with a solid understanding of the business problems being addressed for the client. Understand data schemas in relation to domain-specific contexts, such as EDI formats.
Defect Management: Raise, fix, and retest defects in accordance with project standards.
Estimation: Estimate time, effort, and resource dependencies for personal work.
Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities.
Design Understanding: Understand design and low-level design (LLD) and link it to requirements and user stories.
Certifications: Obtain relevant technology certifications to enhance skills and knowledge.

Skill Examples
Proficiency in SQL, Python, or other programming languages used for data manipulation.
Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
Ability to conduct tests on data pipelines and evaluate results against data quality and performance specifications.
Experience in performance tuning of data processes.
Proficiency in querying data warehouses.

Knowledge Examples
Knowledge of various ETL services provided by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, and Azure ADF/ADLF.
Understanding of data warehousing principles and practices.
Proficiency in SQL for analytics, including windowing functions (see the sketch after this posting).
Familiarity with data schemas and models.
Understanding of domain-related data and its implications.

Additional Comments
Design, develop, and maintain data pipelines and architectures using Azure services. Collaborate with data scientists and analysts to meet data needs. Optimize data systems for performance and reliability. Monitor and troubleshoot data storage and processing issues.

Responsibilities
Design, develop, and maintain data pipelines and architectures using Azure services.
Collaborate with data scientists and analysts to meet data needs.
Optimize data systems for performance and reliability.
Monitor and troubleshoot data storage and processing issues.
Ensure data security and compliance with company policies.
Document data solutions and architecture for future reference.
Stay updated with Azure data engineering best practices and tools.

Qualifications
Bachelor's degree in Computer Science, Information Technology, or a related field.
3+ years of experience in data engineering.
Proficiency in Azure Data Factory, Azure SQL Database, and Azure Databricks.
Experience with data modeling and ETL processes.
Strong understanding of database management and data warehousing concepts.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills.

Skills: Azure Data Factory, Azure SQL Database, Azure Databricks, ETL, Data Modeling, SQL, Python, Big Data Technologies, Data Warehousing, Azure DevOps, Azure, AWS, AWS Cloud, Azure Cloud
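The "SQL for analytics including windowing functions" line maps directly onto a very common pipeline pattern: keeping only the latest record per key. A minimal PySpark sketch, with a hypothetical path and column names:

```python
# Deduplicate to the latest record per key using a window function.
# Path and column names are hypothetical.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.parquet("/data/customer_updates")

w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (
    df.withColumn("rn", F.row_number().over(w))  # rank versions per key
      .filter("rn = 1")                          # keep newest only
      .drop("rn")
)
latest.show()
```

The equivalent SQL uses `ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY updated_at DESC)`.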
Posted 2 months ago
0.0 - 1.0 years
0 Lacs
Koraput, Orissa
On-site
Project Manager – Scanning & Digitization Domain 1. Domain Expertise & Workflow Management: * Develop a thorough understanding of the scanning and digitization domain, including industry best practices, standards, and evolving technologies. * Familiarity with different workflows in scanning and digitization projects, including image enhancement, OCR, indexing, metadata tagging, and final archival. * Knowledge of various types of scanners (flatbed, overhead, ADF, planetary, microfilm, book scanners, etc.) and their applications for different document types and conditions. * Expertise in handling documents of varying sizes (A0, A1, A2, A3, A4, B3, etc.), from fragile historical records to modern paperwork. * Understanding of Document Management Systems (DMS) and integration of digitized data into such systems for retrieval and security. 2. End-to-End Project Management: * Interpret the project scope, technical requirements, and deliverables in alignment with customer expectations and contract specifications. * Manage internal teams, ensuring proper resource allocation, workload distribution, and adherence to project timelines. * Oversee vendor management, including vendor selection, contract negotiation, and performance monitoring when outsourcing scanning and digitization work. * Maintain strong customer relationships through regular communication, addressing concerns, and ensuring client satisfaction throughout the project lifecycle. 3. Planning, Execution & Profitability Management: * Develop a site-wise project execution plan, including daily productivity targets and efficiency benchmarks. * Monitor daily progress and ensure alignment with project timelines, expected output, and defined service level agreements (SLAs). * Conduct regular financial analysis, tracking costs, operational efficiency, and revenue generation against project estimates to maintain profitability. * Identify areas for process optimization to enhance productivity and cost-effectiveness. 4. Daily Reporting & Communication: * Prepare and submit daily, weekly, and monthly progress reports to both management and customers. * Ensure transparency in reporting, covering key performance indicators, challenges faced, and solutions implemented. * Conduct periodic project review meetings with stakeholders to evaluate performance and necessary improvements. 5. Site Challenges & Troubleshooting: * Proactively identify, analyze, and resolve on-site challenges, including administrative, technical, and manpower-related issues. * Address scanner breakdowns, system failures, and other operational bottlenecks with a structured problem-solving approach. * Coordinate with IT and technical teams for quick resolution of software/hardware-related challenges in scanning and digitization. * Ensure adherence to data security and compliance guidelines, mitigating risks related to sensitive document handling. 6. Compliance & Quality Assurance: * Ensure strict adherence to project quality benchmarks and data accuracy standards. * Implement quality control measures to prevent errors in document indexing, scanning resolution, and file formatting. * Conduct periodic audits and review sessions to uphold project compliance and regulatory requirements. 7. Team Leadership & Development: * Lead and mentor the project team, fostering a culture of accountability, efficiency, and continuous learning. * Conduct training sessions for team members on scanning techniques, software usage, and workflow optimization. 
* Motivate the team to achieve high performance and maintain a collaborative work environment.

Job Types: Full-time, Permanent
Pay: From ₹30,000.00 per month
Schedule: Day shift
Supplemental Pay: Performance bonus
Ability to commute/relocate: Koraput, Orissa: Reliably commute or willing to relocate with an employer-provided relocation package (Preferred)
Experience:
Total work: 1 year (Preferred)
Customer relationship management: 1 year (Preferred)
Crisis management: 1 year (Preferred)
Problem management: 1 year (Preferred)
Project management: 1 year (Preferred)
Language: Hindi (Preferred)
Work Location: In person
Application Deadline: 10/04/2025
Expected Start Date: 14/04/2025
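For context, the OCR-and-indexing step described in the workflow section of this posting can be sketched in a few lines. A hedged example, assuming a local Tesseract install with the pytesseract and Pillow libraries; the file path and record fields are hypothetical:

```python
# Minimal OCR + index-record sketch for a digitization workflow.
# Assumes Tesseract is installed; paths and fields are hypothetical.
import json
from datetime import datetime

import pytesseract
from PIL import Image

page = Image.open("scans/batch01/page_0001.tif")
text = pytesseract.image_to_string(page)  # raw OCR text for the page

# A minimal metadata record of the kind fed into a DMS for retrieval.
record = {
    "file": "scans/batch01/page_0001.tif",
    "scanned_at": datetime.utcnow().isoformat(),
    "ocr_excerpt": text[:200],
}
print(json.dumps(record, indent=2))
```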
Posted 2 months ago
5 - 10 years
5 - 12 Lacs
Faridabad, Delhi, Gurgaon
Work from Office
Azure Cloud Engineer
Skills: Azure Cloud, AFD, ADL, ADF, ADB, Terraform, Active Directory, SQL Server Database Administration, Data Warehousing
Experience: 5+ years | Package: up to 14 LPA | Location: Gurgaon | Notice Period: Immediate to 30 days
Contact: Ritika - 8587970773 - ritikab.imaginators@gmail.com

Required Candidate profile
Azure Cloud Engineer: Terraform, Azure Data Lake, Azure Databricks, Azure Data Factory, Active Directory, SQL Server Database Administration, Data Warehousing, managing pipelines/ETL, Networking, Cloudflare
Posted 2 months ago
4 - 8 years
7 - 17 Lacs
Pune, Hyderabad
Work from Office
Role & responsibilities: Outline the day-to-day responsibilities for this role. Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications. Perks and benefits: Mention available facilities and benefits the company is offering with this job.
Posted 2 months ago
4 - 9 years
5 - 10 Lacs
Chennai, Pune, Bengaluru
Hybrid
Azure DevOps | 4+ years | Bangalore/Chennai/Pune
Skills: Azure DevOps, Terraform, Kubernetes, CI/CD, ADF/ADB, cloud infrastructure
Excellent communication skills
Share CV to Yogitha@ontimesolutions.in or 7406353337
Posted 2 months ago
3 - 6 years
12 - 22 Lacs
Bengaluru, Hyderabad, Gurgaon
Hybrid
Work locations: Hyderabad/Bangalore/Mumbai/Pune/Gurgaon/Kolkata/Chennai

Work you'll do
The key job responsibilities will be to:
Handle enhancements and defect management in Oracle IAM Identity & Access Governance, including role-based access control, access request and certification, and the attestation process.
Manage projects through the full system development lifecycle.
Prepare deployment documents and participate in deployment activities.
Attend the daily scrum call and engage with multiple stakeholders to finalize requirement gathering.
Maintain version control of code changes.

Required Skills
Experience in OIM development.
Experience in Java development / ADF experience.
Hands-on experience with Oracle Identity Manager, including custom connector development.
Approval workflows, SOA composites, custom scheduled tasks, event handlers, and reconciliation.
Experience leveraging APIs or web services.
Experience using Oracle PL/SQL.
Experience with UNIX and WebLogic.
Excellent debugging and troubleshooting skills.
Ability to multitask and switch gears to meet changing priorities and accomplish goals/objectives.
Excellent verbal and written communication skills.
Posted 2 months ago
6 - 11 years
8 - 13 Lacs
Bengaluru
Work from Office
Data Engineer with ADF and ADB (MRF00825) - J48819
Role: Data Engineer
Location: Anywhere in India - work from home
All of the below skills/experience are mandatory (please state each):
Total exp:
Exp in/as Data Engineer:
Exp in Azure Data Factory:
Exp in Azure Databricks:
Exp in PowerBI:
Exp in PySpark:
Exp in Python:

Required Candidate profile
Candidate experience should be: 6 to 15 years
Candidate degree should be: BE-Comp/IT, BE-Other, BTech-Comp/IT, BTech-Other, MCA, MCS, ME-Comp/IT, ME-Other, MIS, MIT, MSc-Comp/IT, MS-Comp/IT, MSc-Other, MS-Other, MTech-Comp/IT, MTech-Other
Posted 2 months ago
7 - 12 years
15 - 30 Lacs
Chennai, Bengaluru, Kochi
Work from Office
Azure Data Engineer (Technical Lead) JD
Designation: Technical Leader
Experience: 10-12 years
Location: Chennai, Kochi and Bangalore

Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL processes.
- Implement and optimize data storage solutions in data warehouses and data lakes.
- Collaborate with cross-functional teams to understand data requirements and deliver high-quality data solutions.
- Utilize Microsoft Azure tools for data integration, transformation, and analysis.
- Develop and maintain reports and dashboards using Power BI and other analytics tools.
- Ensure data integrity, consistency, and security across all data systems.
- Optimize database and query performance to support data-driven decision-making.

Qualifications:
- 10-12 years of professional experience in data engineering or a related field.
- Profound expertise in SQL, T-SQL, database design, and data warehousing principles.
- Strong experience with Microsoft Azure tools, including SQL Azure, Azure Data Factory, Azure Databricks, and Azure Data Lake.
- Proficiency in Python, PySpark, and PySQL for data processing and analytics tasks.
- Experience with Power BI and other reporting and analytics tools.
- Demonstrated knowledge of OLAP, data warehouse design concepts, and performance optimization in database and query processing.
- Excellent problem-solving, analytical, and communication skills.
Posted 2 months ago
5 - 10 years
13 - 23 Lacs
Pune, Bengaluru, Hyderabad
Work from Office
5+ yrs Azure Data Engineer: PySpark, Java, SQL expert. Data modeling, ingestion, DW. Azure: ADF, Databricks, ADLS. Hadoop, Hive, Cloudera. Databricks project experience. Strong design and leadership skills. Agile, Git, ML con. Degree in CS/Engineering.
Posted 2 months ago
4 - 6 years
7 - 9 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
Skills: SQL, ADF, ETL, Databricks, PySpark, GenAI, Azure AI Cognitive, Data Engineering
Notice Period: 0-30 days
Required Candidate profile
Location: Bangalore, Chennai, Hyderabad, Pune, Mumbai, Noida, Gurgaon
Posted 2 months ago
10 - 20 years
17 - 32 Lacs
Bengaluru
Remote
Location: Remote

Data Architect Job Description

Responsibilities:

Extensive Experience: Over 12 to 15 years in data architecture, data modelling, and database engineering, with expertise in OLAP/OLTP design, data warehouse solutions, and ELT/ETL processes.
Proficient in Tools and Technologies: 5+ years of experience architecting, designing, and developing Microsoft Azure solutions (Data Factory, Synapse, SQL DB, Cosmos, etc.), Azure Databricks, Snowflake features (Streams, Data Sharing), and enterprise modelling tools (Erwin, PowerDesigner). Experience in the financial services, insurance, or banking industries is a plus.

Data Architecture & Modelling:
Design and implement enterprise-level data architecture solutions across OLTP, OLAP, and Snowflake.
Design and implement effective data models using Snowflake, including Star Schema, Snowflake Schema, and Data Vault methodologies.
Create and maintain logical and physical data models that align with business requirements and Snowflake's best practices.
Knowledge of Slowly Changing Dimensions (SCD Type I & II) in data warehouse projects (see the sketch after this posting).

Data Migration & Transformation:
Lead end-to-end data migration projects from legacy systems to cloud-based environments (Microsoft Azure, Snowflake).
Design staging environments and data integration frameworks.
Work closely with ETL developers to ensure that the data model is seamlessly integrated with data pipelines, facilitating accurate and efficient data flow.
Exposure to dbt ETL tooling.

Technology & Performance Optimization:
Optimize database performance by implementing indexing strategies, partitioning, query tuning, and workload balancing.
Utilize cloud-based data platforms (Azure) and data cataloguing solutions.
Monitor and optimize vended Snowflake performance, including query optimization, resource management, and cost control.

Stakeholder Engagement & Leadership:
Collaborate with data engineers, business analysts, and data scientists to ensure data models meet reporting and analytical needs.
Drive technical roadmap initiatives and ensure alignment with organizational goals.
Mentor junior architects and engineers in data modelling, database optimization, and governance best practices.

Required Skills & Experience:
12+ years of experience in enterprise-level data architecture, data modelling, and database engineering.
Expertise in OLAP & OLTP design, data warehouse solutions, and ELT/ETL processes.
Strong verbal and written communication skills for collaborating with both technical teams and business stakeholders.
Proficiency in data modelling concepts and practices such as normalization, denormalization, and dimensional modelling (Star Schema, Snowflake Schema, Data Vault, Medallion Data Lake).
Experience with Snowflake-specific features, including clustering, partitioning, and schema design best practices.
Proficiency in enterprise modelling tools: Erwin, PowerDesigner, IBM InfoSphere, etc.
Strong experience in Microsoft Azure data pipelines (Data Factory, Synapse, SQL DB, Cosmos DB, Databricks).
Familiarity with Snowflake's native tools and services, including Snowflake Data Sharing, Snowflake Streams & Tasks, and Snowflake Secure Data Sharing.
Strong knowledge of SQL performance tuning, query optimization, and indexing strategies.
Working knowledge of BIAN, ACORD, and ESG risk data integration.
Experience in the financial services, insurance, or banking industries is a plus.

Preferred Certifications:
Microsoft Azure Data Architect certification
Snowflake cloud database certifications
TOGAF or equivalent enterprise architecture certification
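The SCD Type II requirement above is concrete enough to sketch. Below is a hedged example of the classic "expire, then insert" SCD2 pattern via the Snowflake Python connector; the dim_customer/stg_customer tables, their columns, and the credentials are all hypothetical placeholders.

```python
# SCD Type 2 sketch with the Snowflake Python connector.
# Table/column/credential names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<wh>", database="<db>", schema="<schema>",
)

# Step 1: expire the current row for any key whose tracked attribute changed.
conn.cursor().execute("""
    MERGE INTO dim_customer d
    USING stg_customer s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND d.address <> s.address THEN UPDATE
      SET d.is_current = FALSE, d.valid_to = CURRENT_TIMESTAMP()
""")

# Step 2: insert a new current version for changed or brand-new keys
# (after step 1, changed keys no longer have a current row).
conn.cursor().execute("""
    INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.address, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL
""")
conn.close()
```

SCD Type I, by contrast, would simply overwrite the attribute in place with a single MERGE.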
Posted 2 months ago
3 - 4 years
4 - 6 Lacs
Noida
Work from Office
Position summary: The candidate will work with the development team and be responsible for development tasks as an individual contributor. He/she should be technically sound and able to communicate clearly with the client.

Key duties & responsibilities:
Work as a specialist on data engineering projects for end-to-end analytics.
Ensure project delivery on time.
Mentor other teammates and guide them.
Take requirements from the client and communicate with them directly.
Ensure timely creation of documents for the knowledge base, user guides, and other communication systems.
Ensure delivery against business needs, team goals, and objectives, i.e., meeting commitments and coordinating the overall schedule.
Work with large datasets in various formats, run integrity/QA checks, and perform reconciliation for accounting systems.
Lead efforts to troubleshoot and solve process- or system-related issues.
Understand, support, enforce, and comply with company policies, procedures, and Standards of Business Ethics and Conduct.
Experience working with Agile methodology.

Experience, Skills and Knowledge:
Bachelor's degree in computer science or equivalent experience is required; B.Tech/MCA preferable.
Minimum 3-4 years of experience.
Excellent communication skills and a strong commitment to delivering the highest level of service.

Technical Skills:
Expert knowledge and experience working with Spark and Scala.
Experience in Azure Data Factory, Azure Databricks, and Data Lake.
Experience working with SQL and Snowflake.
Experience with data integration tools such as SSIS and ADF.
Experience with programming languages such as Python, Spark, and Scala.
Expert in Astronomer Airflow (see the DAG sketch below).
Experience or exposure to Microsoft Azure Data Fundamentals.

Key competency profile:
Own your development by implementing and sharing your learnings.
Motivate each other to perform at our highest level.
Work the right way by acting with integrity and living our values every day.
Succeed by proactively identifying problems and solutions for yourself and others.
Communicate effectively if there is any challenge.
Be accountable and responsible.
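Since the posting names Astronomer Airflow explicitly, here is a minimal DAG sketch; the task logic and names are hypothetical, and the `schedule` kwarg assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
# Minimal two-task Airflow DAG: extract, then load.
# Task bodies are placeholders; real tasks would call ADF/Snowflake, etc.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source")

def load():
    print("write data to Snowflake")

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2  # run extract before load
```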
Posted 2 months ago
6 - 11 years
20 - 22 Lacs
Bengaluru
Work from Office
Designation: Azure Data Engineer
Experience: 6+ years
Work Location: Bangalore (Infantry Road) - WFO
Notice Period: Immediate joiners

Job Description:
Mandatory skills:
SQL Server - administration and development
Extensive data warehouse / data lake experience
Azure SQL MI
Azure Data Factory
Azure Data Lake / Databricks

Good to have:
StarQuest (Stelo) data replicator
Posted 2 months ago
14 - 24 years
30 - 45 Lacs
Bengaluru, Hyderabad, Noida
Hybrid
Dear candidate,

We found your profile suitable for our current opening; please go through the JD below for a better understanding of the role.

Job Description
Role: Technical Architect / Senior Technical Architect
Experience: 12 - 25 years
Mode of work: Hybrid model
Work Location: Hyderabad/Bangalore/Noida/Pune/Kolkata

Job Summary:
We are seeking an experienced Azure Data Platform Architect to design and lead the development of modern, scalable, cloud-native data platforms leveraging Microsoft Azure services. The ideal candidate will have deep expertise in Azure's data ecosystem, strong data architecture skills, and hands-on experience in SQL and data analysis. Knowledge of AI/ML and Generative AI (GenAI) concepts is a plus, but the core focus is on building robust data pipelines, data lakes, real-time streaming architectures, and analytics capabilities.

Key Responsibilities:

Platform Architecture & Design
Design and architect end-to-end data platforms on Microsoft Azure, balancing performance, scalability, and cost optimization.
Lead the design of data ingestion pipelines supporting batch, real-time, and event-driven data processing.
Define data modeling strategies for raw and curated datasets in Data Lake environments.
Ensure architecture aligns with enterprise data governance, security, and compliance standards.

Azure Data Platform Implementation
Build and optimize data pipelines and data engineering workflows using: Azure Data Factory, Azure Databricks, Azure Data Lake Storage Gen2, Azure Synapse Analytics, Azure Stream Analytics, Azure IoT Hub / telemetry ingestion (if applicable), and Azure Logic Apps / Function Apps.
Support event-driven processing and complex event stream analytics (see the streaming sketch after this posting).
Integrate with enterprise systems, APIs, and external data sources.

Data Analysis, Governance, and Security
Perform advanced SQL querying and data analysis to support business insights, model validation, and platform optimization.
Implement robust data governance, data quality frameworks, and metadata management.
Ensure secure data platform operations, including encryption, role-based access controls, and secrets management using Azure Key Vault.
Set up monitoring, logging, and alerting for operational health and reliability.

Collaboration & Leadership
Act as the data architecture lead, working closely with engineering teams, data scientists, and business stakeholders.
Provide best practices and technical leadership for data modeling, pipeline development, and analytics enablement.
Mentor team members and guide project delivery with a focus on engineering excellence.

Required Skills & Experience:
12+ years of experience in data engineering, architecture, or related fields.
5+ years of experience designing and implementing Azure-based data platforms.
Hands-on expertise with core Azure services: Azure Data Factory, Databricks, Synapse, Data Lake Storage Gen2; real-time streaming tools like Azure Stream Analytics; Azure Logic Apps / Function Apps.
Strong SQL skills and experience performing data analysis directly on large datasets.
Proficiency in Python, PySpark, or equivalent programming languages.
Deep understanding of data modeling, data governance, and cloud security best practices.
Experience in CI/CD pipelines and DevOps practices for data platforms.

Nice to Have:
Familiarity with AI/ML pipelines and basic knowledge of Generative AI (GenAI) use cases such as AI-driven summarization, natural language query over data, and advanced predictive analytics.
Experience integrating Azure Machine Learning (Azure ML) or similar frameworks.
Exposure to LLMs (Large Language Models) and modern GenAI platforms is a plus.
Knowledge of data visualization tools like Power BI or equivalent.

Please check the link below for organisation details: https://www.tavant.com/
If interested, please drop your resume to dasari.gowri@tavant.com

Regards,
Dasari Krishna Gowri
Associate Manager - HR
www.tavant.com
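As a hedged illustration of the event-driven/streaming responsibilities above, here is a self-contained Spark Structured Streaming sketch. It uses the built-in `rate` source rather than a real event hub, so it runs anywhere PySpark is installed; a production pipeline would swap in an Event Hubs or Kafka source.

```python
# Self-contained streaming sketch: windowed event counts to the console.
# The "rate" source generates synthetic (timestamp, value) rows.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream-demo").getOrCreate()

events = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# Windowed aggregation: count events per 30-second window.
counts = events.groupBy(F.window("timestamp", "30 seconds")).count()

query = (
    counts.writeStream.outputMode("complete")
          .format("console")
          .start()
)
query.awaitTermination()
```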
Posted 2 months ago
8 - 10 years
19 - 30 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Work from Office
Azure Databricks, PySpark, Azure Data Factory
Posted 2 months ago
4 - 9 years
5 - 15 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
Hi, greetings from Enormous IT. We are hiring for Oracle Integration Cloud (OIC) with 4+ years of experience.

Role: Oracle Integration Cloud (OIC)
Experience: 5+ years
Work Mode: Hybrid
Work Location: Anywhere in India

Provide 8*5 support to the OIC environment, mainly the integration with the Oracle Fusion GL module.

Qualification
5+ years of relevant professional experience.
Bachelor's degree in computer science, information systems, software engineering, or a related field preferred.
4+ years of related experience with Oracle Fusion Cloud Applications and Oracle PaaS.
Hands-on technical experience in developing and supporting PaaS extensions in VBCS, ADF, DBCS, and OIC to Oracle Cloud.
Hands-on experience in developing payloads for all integration points to integrate with Oracle Fusion.
Design, develop, and provide hands-on support of customizations using Visual Builder to Oracle Cloud Applications.
Design, develop, and provide hands-on support of integrations in OIC using web services, FBDI, and data transformation from OTBI and BI Publisher (see the REST invocation sketch below).
Experience in extending Oracle SaaS footprints with Oracle PaaS (creating custom applications and integrating with Oracle Cloud Applications).
Proficient experience in OIC, Java SE, Java EE, Oracle ADF, SOAP, RESTful, XML, SQL, and PL/SQL.
Hands-on experience in integration with Oracle Fusion (Financials, AR, Order Management, Procurement, Projects, Inventory, GL) is a must.

Thanks & Regards,
Komali
Talent Acquisition Executive
tat5@enormousit.com
Enormous IT Services Pvt. Ltd.
www.enormousit.com
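As a hedged illustration of invoking an OIC integration's REST endpoint, the kind of integration point described above: the instance URL, flow name, payload fields, and basic-auth credentials below are all hypothetical placeholders (real deployments typically use OAuth).

```python
# Sketch: trigger a REST-exposed OIC integration from Python.
# Endpoint, payload, and credentials are hypothetical placeholders.
import requests

OIC_ENDPOINT = (
    "https://<oic-instance>.integration.ocp.oraclecloud.com"
    "/ic/api/integration/v1/flows/rest/GL_IMPORT/1.0/run"
)

resp = requests.post(
    OIC_ENDPOINT,
    json={"ledger": "US_PRIMARY", "period": "Jan-25"},
    auth=("<username>", "<password>"),  # basic auth for illustration only
    timeout=30,
)
resp.raise_for_status()  # surface HTTP errors early
print(resp.status_code, resp.json())
```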
Posted 2 months ago
7 - 12 years
0 - 3 Lacs
Hyderabad
Hybrid
Hands-on programming experience using ADLS, ADF, Azure Functions, and Azure SQL.
• Must have experience implementing CI/CD for ADLS, ADF, and Azure Functions (see the pipeline-artifact sketch below).
• Azure storage options: Azure Data Lake Storage, Azure Blob Storage, Azure VM.
• Azure Data Factory: ADF components & basic terminology, Azure Portal basics & ADF creation, basic pipelines, triggers, Copy activity, data transformation, components and mappings with Data Flows, linked services, datasets, parametrization, integration with different sources, security, debugging and troubleshooting.
• Azure Git configurations.
• Must have experience in either Python/pyscript.
• Ability to read architecture diagrams, data models, and data flows.
• Database design and development: SQL, stored procedures, performance tuning, DB design & optimization.
• Good to have: Snowflake experience.
• Experience using Azure DevOps, GitHub, Visual Studio or equivalent.
• Emphasis on code quality, integration testing, performance testing and tuning.
• Hands-on development and system validation and testing of data pipelines, flows, and services.
• Development of technical system and process documentation.
• Analyzing data to effectively coordinate the installation of new data or the modification of existing data.
• Managing data pipelines through the software development lifecycle.
• Monitoring data process performance post deployment until transitioned to operations.
• Communicating key project data to team members and building cohesion among teams.
• Developing and executing project plans.

7+ years of experience | Notice period: 15 days max | Hybrid - Hyderabad
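For the CI/CD-for-ADF requirement, here is a sketch of the kind of pipeline JSON artifact that a Git-integrated factory versions and a release pipeline deploys. The pipeline, dataset, and parameter names are hypothetical; the overall shape follows the ADF pipeline JSON schema.

```python
# Sketch: a parameterised ADF copy pipeline as the JSON artifact that
# lives in a Git-integrated ADF repo. Names are hypothetical.
import json

pipeline = {
    "name": "pl_copy_daily",
    "properties": {
        "parameters": {"run_date": {"type": "string"}},
        "activities": [
            {
                "name": "CopySourceToLake",
                "type": "Copy",
                "inputs": [{"referenceName": "ds_source", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "ds_lake", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "SqlSource"},
                    "sink": {"type": "ParquetSink"},
                },
            }
        ],
    },
}
print(json.dumps(pipeline, indent=2))
```

In a typical release flow, artifacts like this (or the factory's generated ARM template) are promoted from dev to test to prod with environment-specific parameter files.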
Posted 2 months ago
16 - 26 years
40 - 60 Lacs
Agra
Work from Office
As a Synapse Principal Architect, you will work to solve some of the most complex and captivating data management problems that enable clients to become data-driven organizations, seamlessly switching between the roles of individual contributor, team member, and architect as each project demands in order to define, design, and deliver actionable insights.

On a typical day, you might:
Collaborate with clients to understand the overall requirements and create a robust, extensible architecture that meets the client/business requirements.
Identify the right technical stack and tools that best meet the client's requirements.
Work with the client to define a scalable architecture.
Design the end-to-end solution along with the data strategy, including standards, principles, data sources, storage, pipelines, data flow, and data security policies.
Collaborate with data engineers, data scientists, and other stakeholders to execute the data strategy.
Implement Synapse best practices, data quality, and data governance.
Define the right data distribution/consumption pattern for downstream systems and consumers.
Own end-to-end delivery of the project and design and develop reusable frameworks.
Closely monitor project progress and provide regular updates to the leadership teams on milestones, impediments, etc.
Support business proposals by providing solution approaches, detailed estimations, technical insights, and best practices.
Guide and mentor team members, and create technical artifacts.
Demonstrate thought leadership.

Job Requirement
A total of 16+ years of professional experience, including a minimum of 5 years specifically in architect roles focusing on analytics solutions, plus a minimum of 3+ years working with cloud platforms and familiarity with public cloud architectures.
Experience in implementing modern data platform / data warehousing solutions covering all major data solutioning aspects: data integration, harmonization, standardization, modelling, governance, lineage, cataloguing, data sharing, and reporting.
A solid understanding and working knowledge of the ADF, Logic Apps, dedicated SQL pool, serverless SQL pool, and Spark pool services of Azure Synapse Analytics, focused on optimization, workload management, availability, security, observability, and cost management strategies (see the serverless query sketch below).
Good hands-on experience writing procedures and scripts using T-SQL and Python.
Good understanding of RDBMS systems and distributed computing on the cloud, with hands-on experience in data modelling.
Experience in large-scale migration from on-prem to the Azure cloud.
Good understanding of Microsoft Fabric.
Excellent understanding of database and data warehouse concepts.
Experience in working with Azure DevOps.
Excellent communication & interpersonal skills.
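As a hedged sketch of the serverless SQL pool usage mentioned above: querying a lake file through OPENROWSET via pyodbc. The workspace name, storage account, and path are hypothetical, and the connection string assumes ODBC Driver 18 with Azure AD interactive login; other auth modes work equally well.

```python
# Sketch: query a Parquet file in the lake via a Synapse serverless SQL
# pool. Server/account/path are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<workspace>-ondemand.sql.azuresynapse.net;"
    "Database=master;Authentication=ActiveDirectoryInteractive;"
)

sql = """
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://<account>.dfs.core.windows.net/<container>/sales/*.parquet',
    FORMAT = 'PARQUET'
) AS rows
"""
for row in conn.cursor().execute(sql):
    print(row)
```

Serverless pools bill per data scanned, which is why the posting stresses cost management alongside optimization.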
Posted 2 months ago
18 - 28 years
70 - 75 Lacs
Noida
Work from Office
Shift Timing: 1 pm to 10 pm

As a leader of product engineering teams, you will encourage and enable the use of leading software engineering best practices and iterative SDLC processes. You will attract the best software engineers and DevOps professionals to build usable and functional enterprise software. You will apply your problem solving and critical thinking to navigate business priorities, make delivery commitments, and influence stakeholders.

Job Responsibilities:
Facilitate Agile processes and tools to enable the effective communication of stories, requirements, acceptance criteria, and progress in support of R1's software engineering objectives.
Steer the team towards the right technical direction for the required solution within your span.
Manage technology transformation in a rapidly changing environment, providing technology leadership to software engineering teams for mission-critical applications.
Work with product management, business stakeholders, and architecture leadership to understand software requirements; help shape, estimate, and plan product roadmaps and generate release plans.
Monitor team work-completion rate, defect rates, code coverage, cycle times, and other product engineering KPIs; understand development risks and technical debt and make process improvement recommendations to the team, management, and business stakeholders.
Foster an environment of accountability between engineering team members and between engineering teams and business stakeholders.
Determine the hiring plan and recruit, motivate, and lead the best software engineering, DevOps, and QA talent.
Mentor and develop the skills of members of the development team, cultivating a culture of learning.
Advise on and contribute to the user experience, architecture, and test-driven development of product features and functionality.
Serve as the point of escalation for team concerns and engineering obstacles.
Provide solution architecture to cater to requirements, balancing timelines and modern design principles.

Qualification and skills required:
Bachelor's degree in computer science, business, or a similar field.
18+ years of experience building web-based enterprise software using the Microsoft .NET stack, with most of that experience leading engineering teams.
Knowledge of a major cloud platform like Azure, AWS, or GCP, with experience in handling cloud transformation initiatives.
Experience working in a pure DevOps environment.
Prior hands-on experience developing code in Microsoft technologies: .NET, C#, ASP.NET, SQL Server, Cosmos DB, Azure native, and Python, RabbitMQ or Kafka, Azure Databricks, ADF, SSIS, Angular or React.
Significant talent for handling stressful situations effectively, prioritizing work, meeting deadlines, and motivating others.
Experience in recruiting, hiring, and retaining the best software engineers, DevOps professionals, and QA professionals.

Key Success Criteria:
Provide engineering leadership through technology, Agile processes, metrics, and reporting.
Stakeholder management in India and the US.
Lead knowledge transition and organization transformation.
Posted 2 months ago
4 - 9 years
7 - 17 Lacs
Chennai, Pune, Bengaluru
Work from Office
Role & responsibilities

**Big Data Lead**
- Must-Have Skills: PySpark, Databricks, SQL
- Nice-to-Have Skills: ADF
- Experience: 4-12 years
- Work Locations: Bangalore/Chennai/Pune
- Notice Period: Immediate - 30 days
Posted 2 months ago
1 - 6 years
0 - 3 Lacs
Ghaziabad, Delhi NCR, Lucknow
Work from Office
Role: Data Entry & Scanning Operator for a Govt project (6-year project, 20 Cr documents to scan).
Experience in ADF, overhead, A0, and map scanning.
Qualification: minimum 12th pass. Good English not required. Male candidates only.
Opening pan-India; candidates may apply and relocate.
Posted 2 months ago
2 - 4 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary
Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. Ensuring streamlined end-to-end Oracle Fusion technical capability that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle practice team, we leverage opportunities around digital disruption, new-age operating models, and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities:
Extensive experience in Oracle ERP / Fusion SaaS/PaaS project implementations as a technical developer.
Completed at least 2 full Oracle Cloud (Fusion) implementations.
Extensive knowledge of database structure for ERP / Oracle Cloud (Fusion).
Extensive work on BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration (OIC).

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration (OIC)
Preferred skill sets: database structure for ERP / Oracle Cloud (Fusion)
Years of experience required: Minimum 2 to 4 years of Oracle Fusion experience
Educational Qualification: Graduate / Post Graduate

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor Degree, Master Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle Fusion Middleware (OFM)
Optional Skills: Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
Posted 2 months ago
5 - 8 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary
Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. Ensuring streamlined end-to-end Oracle Fusion technical capability that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle practice team, we leverage opportunities around digital disruption, new-age operating models, and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities:
Extensive experience in Oracle ERP / Fusion SaaS/PaaS project implementations as a technical developer.
Completed at least 2 full Oracle Cloud (Fusion) implementations.
Extensive knowledge of database structure for ERP / Oracle Cloud (Fusion).
Extensive work on BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration (OIC).

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration (OIC)
Preferred skill sets: database structure for ERP / Oracle Cloud (Fusion)
Years of experience required: Minimum 2 to 4 years of Oracle Fusion experience
Educational Qualification: Graduate / Post Graduate

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor Degree, Master Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle Fusion Middleware (OFM)
Optional Skills: Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
Posted 2 months ago