4.0 years
6 - 9 Lacs
Hyderābād
On-site
Analyze, design, develop, troubleshoot, and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications.

As a member of the software engineering division, you will perform high-level design based on provided external specifications; specify, design, and implement minor changes to existing software architecture; build highly complex enhancements; and resolve complex bugs. You will build and execute unit tests and unit plans, review integration and regression test plans created by QA, and communicate with QA and porting engineering as necessary to discuss minor changes to product functionality and to ensure quality and consistency across specific products.

Duties and tasks are varied and complex, requiring independent judgment. Fully competent in own area of expertise; may take a project lead role and/or supervise lower-level personnel. BS or MS degree or equivalent experience relevant to the functional area, plus 4 years of software engineering or related experience. Career Level: IC3.

Our Procurement Cloud is the key offering from the Oracle Applications Cloud Suite. Procurement Cloud is a fast-growing division within Oracle Cloud Applications and has a variety of customers, from a leading fast-food chain to the world's largest furniture maker. Procurement Cloud Development works on sophisticated areas ranging from a complex search engine to a time-critical auctions/bidding process to core business functionality like bulk order processing, to name a few. As a member of our team, you will use the latest technologies, including JDeveloper, ADF, Oracle 12c Database, Oracle SQL, BPEL, Oracle Text, BC4J, web services, and service-oriented architectures (SOA). In addition to gaining this technical experience, you will also be exposed to the business side of the industry. Developers are involved in the entire development cycle, so you will have the chance to take part in activities such as working with the product management team to define the product’s functionality and interacting with customers to resolve issues.

So are you looking to be technically challenged and gain business experience? Do you want to be part of a team of upbeat, hard-working developers who know how to work and have fun at the same time? Look no further: join us and be the newest member of Fusion Procurement Development!

Skills/Languages: 1-8 years of experience building Java-based applications. Good programming skills and excellent analytical/logical skills. Able to craft a feature from end to end. Can think outside the box, has practical knowledge of the given technologies, and can apply logic to tackle a technical problem even without prior background in it. Should be persistent. Experience in BPEL, Workflow System, ADF, REST implementation, AI/ML, and Scrum processes is a plus.
Required: Java, OOPS concepts, JavaScript/VBCS/JET
Optional: JDBC, XML, SQL, PL/SQL, Unix/Linux, REST, ADF, AI/ML, Scrum
Posted 1 week ago
10.0 - 19.0 years
8 - 9 Lacs
Thiruvananthapuram
On-site
10 - 19 Years | 10 Openings | Trivandrum

Role Description

Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.

Outcomes: Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reuse of proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others. Interpret requirements and create optimal architecture and design solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code using best standards; debug and test solutions to ensure best-in-class quality. Tune performance of code and align it with the appropriate infrastructure, understanding cost implications of licenses and infrastructure. Create data schemas and models effectively. Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes. Validate results with user representatives, integrating the overall solution. Influence and enhance customer satisfaction and employee engagement within project teams.

Measures of Outcomes: Adherence to engineering processes and standards; adherence to schedule/timelines; adherence to SLAs where applicable; number of defects post-delivery; number of non-compliance issues; reduction of recurrence of known defects; quick turnaround of production bugs; completion of applicable technical/domain certifications; completion of all mandatory training requirements; efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times); average time to detect, respond to, and resolve pipeline failures or data issues; number of data security incidents or compliance breaches.

Outputs Expected:
Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for team and peers.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, test cases, and results.
Configure: Define and govern the configuration management plan. Ensure compliance from the team.
Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team.
Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.
Manage Project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules.
Manage Defects: Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality.
Estimate: Create and provide input for effort and size estimation and plan resources for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models.
Interface with Customer: Clarify requirements and provide guidance to the Development Team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs.
Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures.
Certifications: Obtain relevant domain and technology certifications.

Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning. Experience in data warehouse design and cost improvements. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Communicate and explain design/development aspects to customers. Estimate time and resource requirements for developing/debugging features/components. Participate in RFP responses and solutioning. Mentor team members and guide them in relevant upskilling and certification.

Knowledge Examples: Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF. Proficient in SQL for analytics and windowing functions. Understanding of data schemas and models. Familiarity with domain-related data. Knowledge of data warehouse optimization techniques. Understanding of data security concepts. Awareness of patterns, frameworks, and automation practices.
Skills: Scala, Python, PySpark

About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with its clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into its clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
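As a hedged illustration of the pipeline work this role describes (ingesting, wrangling, transforming, and joining data with PySpark and SQL-style logic), here is a minimal batch-pipeline sketch. The file paths, schemas, and column names are hypothetical assumptions for illustration, not details from the posting.

```python
# Minimal sketch of a PySpark batch pipeline: ingest, wrangle, join, write.
# All paths and column names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Ingest: raw CSV orders and a JSON customer dimension (hypothetical sources).
orders = spark.read.option("header", True).csv("/raw/orders.csv")
customers = spark.read.json("/raw/customers.json")

# Wrangle: cast types and drop malformed rows.
orders = (orders
          .withColumn("amount", F.col("amount").cast("double"))
          .dropna(subset=["order_id", "customer_id", "amount"]))

# Join and transform: total spend and order count per customer region.
result = (orders.join(customers, "customer_id")
          .groupBy("region")
          .agg(F.sum("amount").alias("total_spend"),
               F.count("order_id").alias("order_count")))

# Load: write partitioned Parquet for downstream analytics.
result.write.mode("overwrite").partitionBy("region").parquet("/curated/spend_by_region")
```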
Posted 1 week ago
0 years
0 Lacs
Andhra Pradesh
On-site
To be responsible for data modelling, design, and development of the batch and real-time extraction, load, transform (ELT) processes, and the setup of the data integration framework, ensuring best practices are followed during integration development.

Bachelor's degree in CS/IT or a related field (minimum). Azure Data Engineer (ADF, ADLS, MS Fabric), Databricks, Azure DevOps, Confluence.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status, or any other basis covered by applicable law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Data, Analytics & AI. Management Level: Manager.

Job Description & Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Key Responsibilities:
Designs, implements, and maintains reliable and scalable data infrastructure.
Writes, deploys, and maintains software to build, integrate, manage, maintain, and quality-assure data.
Designs, develops, and delivers large-scale data ingestion, data processing, and data transformation projects on the Azure cloud.
Mentors and shares knowledge with the team through design reviews, discussions, and prototypes.
Works with customers to deploy, manage, and audit standard processes for cloud products.
Adheres to and advocates for software and data engineering standard processes (e.g., technical design and review, unit testing, monitoring, alerting, source control, code review, and documentation).
Deploys secure and well-tested software that meets privacy and compliance requirements; develops, maintains, and improves the CI/CD pipeline.
Follows site-reliability engineering standard processes for service reliability: on-call rotations for services they maintain, and responsibility for defining and maintaining SLAs. Designs, builds, deploys, and maintains infrastructure as code. Containerizes server deployments.
Works as part of a cross-disciplinary team, closely with other data engineers, software engineers, data scientists, data managers, and business partners in a Scrum/Agile setup.

Job Requirements:
Education: Bachelor's or higher degree in Computer Science, Engineering, Information Systems, or other quantitative fields.
Experience: 8 to 12 years of relevant experience. Deep, hands-on experience designing, planning, productionizing, maintaining, and documenting reliable and scalable data infrastructure and data products in complex environments.
Hands-on experience with: Spark for data processing (batch and/or real-time); configuring Delta Lake on Azure Databricks; languages: SQL, PySpark, Python; cloud platforms: Azure; Azure Data Factory (must), Azure Data Lake (must), Azure SQL DB (must), Synapse (must), SQL Pools (must), Databricks (good to have); designing data solutions in Azure, including data distributions and partitions, scalability, cost management, disaster recovery, and high availability; Azure DevOps (or similar tools) for source control and building CI/CD pipelines.
Experience designing and implementing large-scale distributed systems.
Customer management and front-ending, with the ability to lead large organizations through influence.

Desirable Criteria: Strong customer management: own the delivery for the Data track with customer stakeholders. Continuous learning and improvement attitude.

Key Behaviors: Empathetic: cares about our people, our community, and our planet. Curious: seeks to explore and excel. Creative: imagines the extraordinary. Inclusive: brings out the best in each other.

Mandatory Skill Sets ('must have'): Synapse, ADF, Spark, SQL, PySpark, Spark-SQL.
Preferred Skill Sets ('good to have'): Cosmos DB, data modeling, Databricks, Power BI, experience building an analytics solution with SAP as the data source for ingestion pipelines.
Depth: The candidate should have in-depth, hands-on experience with end-to-end solution design in Azure Data Lake, ADF pipeline development and debugging, various file formats, Synapse, and Databricks, with excellent coding skills in PySpark and SQL and logic-building capabilities, plus sound knowledge of optimizing workloads.

Years of Experience Required: 8 to 12 years of relevant experience.
Education Qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above).
Degrees/Fields of Study Required: Master of Business Administration, Bachelor of Engineering, Bachelor of Technology, Master of Engineering.
Required Skills: Apache Synapse. Optional Skills: Microsoft Power Business Intelligence (BI).
Travel Requirements: Not specified. Available for Work Visa Sponsorship? No. Government Clearance Required? No.
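Since configuring Delta Lake on Azure Databricks is called out as a must-have above, here is a hedged sketch of the basic write/merge pattern. The storage account, container, table names, and key column are hypothetical placeholders.

```python
# Illustrative sketch: writing a Delta table on Azure Databricks and applying
# an incremental MERGE upsert. Paths and names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

events = spark.read.json("abfss://raw@mystorageacct.dfs.core.windows.net/events/")

# Write as a managed Delta table; ACID transactions and time travel come with Delta.
(events.write
       .format("delta")
       .mode("overwrite")
       .saveAsTable("bronze.events"))

# Incremental upsert pattern commonly used with Delta Lake.
from delta.tables import DeltaTable

target = DeltaTable.forName(spark, "bronze.events")
updates = spark.read.json("abfss://raw@mystorageacct.dfs.core.windows.net/events_delta/")
(target.alias("t")
       .merge(updates.alias("u"), "t.event_id = u.event_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```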
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Skill: Azure (ADB, ADF, Azure Synapse, SQL, PySpark). Experience: 5-12 years. Location: PAN India. Notice period: immediate to 30 days only.
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Mohali district, India
On-site
Company Description: Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details: Position: Support Engineer – AI & Data. Experience: 5-8 years. Work Mode: Onsite. Location: Pune or Mohali.

Job Overview: We are seeking a motivated and talented Support Engineer to join our AI & Data team. This job offers a unique opportunity to gain hands-on experience with the latest tool technologies, quality documentation preparation, and Software Development Lifecycle responsibilities. If you are passionate about technology and eager to apply your academic knowledge in a real-world setting, this role is perfect for you.

Key Responsibilities: Collaborate with the AI & Data team to support various projects. Utilize MS Office tools for documentation and project management tasks. Assist in the development, testing, deployment, and support of BI solutions. Take part in ITIL process management. Prepare and maintain high-quality documentation for various processes and projects. Stay updated with the latest industry trends and technologies to contribute innovative ideas.

Essential Requirements: Experience in SQL, Azure Data Factory (ADF), and data modeling is a must. Experience in Logic Apps and Azure integrations is nice to have. Good communication skills; needs to connect with stakeholders directly. Strong critical thinking and problem-solving skills. Certification in any industry-relevant skills is an advantage.

Preferred Skills and Qualifications: Strong understanding of software development and testing principles. Familiarity with data warehousing concepts and technologies. Excellent written and verbal communication skills. Ability to work both independently and as part of a team. Attention to detail and strong organizational skills.

What We Offer: Hands-on experience with the latest digital tools and technologies. Exposure to real-world projects and industry best practices. Opportunities to prepare and contribute to quality documentation. Experience in SDET responsibilities, enhancing your software testing and development skills. Mentorship from experienced professionals in the field.

Skills: management, development, AI, MS Office, data modeling, Azure, testing, data, software development lifecycle, documentation, ITIL process management, Azure Data Factory, ITIL, SQL, data warehousing, Logic Apps, Azure integrations
Posted 1 week ago
5.0 - 10.0 years
3 - 6 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Sr. Data Engineer - Detailed JD (Roles and Responsibilities)

Education: Bachelor's degree in Computer Science or Engineering.
Candidate should have 5+ years of experience in data engineering or a related data-solutions role.
Hands-on experience solutioning and implementing analytical capabilities using the Azure Data Analytics platform, including Azure Data Factory, Azure Logic Apps, Azure Functions, Azure Storage, Azure SQL Data Warehouse/Synapse, and Azure Data Lake.
Candidate should be able to support all phases of analytical development, from identification of key business questions through data collection and ETL.
Good experience developing data solutions on lakehouse platforms like Dremio is an added benefit.
Strong knowledge of data modeling and data design is a plus.
Microsoft data certification is a plus.

Mandatory skills: Azure Data Factory. Desired skills: Azure Data Factory, data modeling. Domain: Financial Services.
Work location: any location. Work mode (WFO/WFH/Hybrid): WFO - Hybrid. Working in shifts outside standard daylight hours: No. Location: PAN India. Years of experience: 5+.
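Azure Data Factory is the mandatory skill here, so a small, hedged sketch of driving ADF programmatically may help orient candidates. It uses the azure-mgmt-datafactory SDK; the subscription ID, resource group, factory, pipeline name, and parameters are hypothetical placeholders.

```python
# Rough sketch: trigger and monitor an Azure Data Factory pipeline run via the
# azure-mgmt-datafactory SDK. All resource names below are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")

run = client.pipelines.create_run(
    resource_group_name="rg-data",        # hypothetical
    factory_name="adf-analytics",         # hypothetical
    pipeline_name="pl_ingest_orders",     # hypothetical
    parameters={"load_date": "2024-01-31"},
)

# Poll the run status until the pipeline finishes.
status = client.pipeline_runs.get("rg-data", "adf-analytics", run.run_id)
print(status.status)  # e.g. Queued / InProgress / Succeeded / Failed
```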
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Description: Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details: Position: Support Engineer – AI & Data. Experience: 5-8 years. Work Mode: Onsite. Location: Pune or Mohali.

Job Overview: We are seeking a motivated and talented Support Engineer to join our AI & Data team. This job offers a unique opportunity to gain hands-on experience with the latest tool technologies, quality documentation preparation, and Software Development Lifecycle responsibilities. If you are passionate about technology and eager to apply your academic knowledge in a real-world setting, this role is perfect for you.

Key Responsibilities: Collaborate with the AI & Data team to support various projects. Utilize MS Office tools for documentation and project management tasks. Assist in the development, testing, deployment, and support of BI solutions. Take part in ITIL process management. Prepare and maintain high-quality documentation for various processes and projects. Stay updated with the latest industry trends and technologies to contribute innovative ideas.

Essential Requirements: Experience in SQL, Azure Data Factory (ADF), and data modeling is a must. Experience in Logic Apps and Azure integrations is nice to have. Good communication skills; needs to connect with stakeholders directly. Strong critical thinking and problem-solving skills. Certification in any industry-relevant skills is an advantage.

Preferred Skills and Qualifications: Strong understanding of software development and testing principles. Familiarity with data warehousing concepts and technologies. Excellent written and verbal communication skills. Ability to work both independently and as part of a team. Attention to detail and strong organizational skills.

What We Offer: Hands-on experience with the latest digital tools and technologies. Exposure to real-world projects and industry best practices. Opportunities to prepare and contribute to quality documentation. Experience in SDET responsibilities, enhancing your software testing and development skills. Mentorship from experienced professionals in the field.

Skills: management, development, AI, MS Office, data modeling, Azure, testing, data, software development lifecycle, documentation, ITIL process management, Azure Data Factory, ITIL, SQL, data warehousing, Logic Apps, Azure integrations
Posted 1 week ago
6.0 years
0 Lacs
Udaipur, Rajasthan, India
On-site
Job Description: We are looking for a highly skilled and experienced Data Engineer with 4–6 years of hands-on experience in designing and implementing robust, scalable data pipelines and infrastructure. The ideal candidate will be proficient in SQL and Python and have a strong understanding of modern data engineering practices. You will play a key role in building and optimizing data systems, enabling data accessibility and analytics across the organization, and collaborating closely with cross-functional teams including Data Science, Product, and Engineering.

Key Responsibilities: Design, develop, and maintain scalable ETL/ELT data pipelines using SQL and Python. Collaborate with data analysts, data scientists, and product teams to understand data needs. Optimize queries and data models for performance and reliability. Integrate data from various sources, including APIs, internal databases, and third-party systems. Monitor and troubleshoot data pipelines to ensure data quality and integrity. Document processes, data flows, and system architecture. Participate in code reviews and contribute to a culture of continuous improvement.

Required Skills: 4–6 years of experience in data engineering, data architecture, or backend development with a focus on data. Strong command of SQL for data transformation and performance tuning. Experience with Python (e.g., pandas, Spark, ADF). Solid understanding of ETL/ELT processes and data pipeline orchestration. Proficiency with RDBMS (e.g., PostgreSQL, MySQL, SQL Server). Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery). Familiarity with version control (Git), CI/CD workflows, and containerized environments (Docker, Kubernetes). Basic programming skills. Excellent problem-solving skills and a passion for clean, efficient data systems.

Preferred Skills: Experience with cloud platforms (AWS, Azure, GCP) and services like S3, Glue, Dataflow, etc. Exposure to enterprise solutions (e.g., Databricks, Synapse). Knowledge of big data technologies (e.g., Spark, Kafka, Hadoop). Background in real-time data streaming and event-driven architectures. Understanding of data governance, security, and compliance best practices. Prior experience working in an agile development environment.

Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.
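One responsibility above is integrating data from APIs and internal databases; as a hedged sketch of that pattern, here is a small pandas/SQLAlchemy ETL. The endpoint URL, connection string, table names, and columns are assumptions for illustration only.

```python
# Hypothetical sketch: pull JSON from a REST API, join it to rows from a
# PostgreSQL table, and land the result back in the warehouse.
import pandas as pd
import requests
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@db-host:5432/analytics")  # hypothetical DSN

# Extract: third-party API and internal database.
api_rows = requests.get("https://api.example.com/v1/shipments", timeout=30).json()
shipments = pd.DataFrame(api_rows)
orders = pd.read_sql("SELECT order_id, customer_id, amount FROM orders", engine)

# Transform: join the two sources and derive a simple delivery metric.
merged = shipments.merge(orders, on="order_id", how="inner")
merged["late"] = merged["delivered_at"] > merged["promised_at"]

# Load: write the result back for reporting.
merged.to_sql("order_delivery_status", engine, if_exists="replace", index=False)
```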
Posted 1 week ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Role: Senior Dot Net Developer. Experience: 8+ years. Notice period: Immediate. Location: Trivandrum / Kochi.

Introduction: Candidates with 8+ years of experience in the IT industry and with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud. Working hours: 8 hours, with 4 hours of overlap during the EST time zone (12 PM - 9 PM). These overlap hours are mandatory, as meetings happen during them.

Responsibilities include:
Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery.
Integrate and support third-party APIs and external services.
Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack.
Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC).
Participate in Agile/Scrum ceremonies and manage tasks using Jira.
Understand technical priorities, architectural dependencies, risks, and implementation challenges.
Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability.

Certifications: Microsoft Certified: Azure Fundamentals; Microsoft Certified: Azure Developer Associate; other relevant certifications in Azure, .NET, or cloud technologies.

Primary Skills: 8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework / EF Core, JavaScript, jQuery, and REST APIs. Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types. Skilled in unit testing with xUnit and MSTest. Strong in software design patterns, system architecture, and scalable solution design. Ability to lead and inspire teams through clear communication, technical mentorship, and ownership. Strong problem-solving and debugging capabilities. Ability to write reusable, testable, and efficient code. Develop and maintain frameworks and shared libraries to support large-scale applications. Excellent technical documentation, communication, and leadership skills. Microservices and Service-Oriented Architecture (SOA). Experience in API integrations.

2+ years of hands-on experience with Azure Cloud Services, including Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring.

Secondary Skills: Familiarity with AngularJS, ReactJS, and other front-end frameworks. Experience with Azure API Management (APIM). Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes). Experience with Azure Data Factory (ADF) and Logic Apps. Exposure to application support and operational monitoring. Azure DevOps CI/CD pipelines (Classic / YAML).
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Head the Production Support division, ensuring uninterrupted operations of critical banking systems including Oracle FLEXCUBE, OFSAA, Oracle Financials, and regulatory platforms. Lead and manage a team of 30+ professionals across L1 to L3 support tiers, overseeing incident management, root cause analysis, and service improvement initiatives. Ensure high system availability and strict adherence to SLAs in coordination with business, compliance, and infrastructure teams. Manage cross-departmental support including Corporate Banking, Finance, Risk, and Compliance. Handle incident escalations, prioritize issues based on business impact, and oversee timely resolution. Coordinate release and change management activities to minimize disruption to business operations. Regularly engage with business heads to align system capabilities with operational requirements. Define and enforce robust monitoring, alerting, and reporting frameworks for production systems. Champion automation and optimization initiatives to reduce manual dependencies and improve turnaround times.

Corporate Banking: Manage and support applications for Corporate Lending and Trade Finance (LC, BC, Bank Guarantees).
Finance Department: Support Oracle Financials (GL, AP, AR), including GL consolidation and monthly GST filings.
Compliance Systems: Oversee OFSAA AML/KYC, enabling AML alert generation and KYC scoring with reverse feeds to CBS.
Risk Systems: Ensure seamless operation of LOS, LMS, and MF systems.
Regulatory Reporting: Manage ADF for centralized regulatory report generation and NACH for mandate registration and processing.

Key Skills and Technical Expertise:
Platforms: Oracle FLEXCUBE, OFSAA (AML/KYC, BASEL, ALM, LRM), Oracle Financials (GL/AP/AR), ADF, NACH.
Databases: Oracle 10g/9i/8i.
Tools: PL/SQL Developer, SQL Navigator, Toad, OBIEE.
Operating Systems: Windows, HP-UX, IBM AIX.
Methodologies: Agile, Waterfall, ITIL (preferred).

Preferred Qualifications: Bachelor's degree in Engineering (Electronics/Computer Science preferred). Scrum Master certification or equivalent Agile training. Strong understanding of regulatory frameworks and compliance obligations in banking.

Leadership & Soft Skills: Proven ability to lead large, diverse teams (30+ members). Exceptional communication and stakeholder management. Strong problem-solving and critical thinking skills. High accountability, operational discipline, and performance focus. Comfortable working in high-pressure production environments.
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Description: Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details: Position: Data Engineer (Azure). Experience: 5-8 years. Work Mode: Onsite. Location: Pune or Mohali.

Must Have: Data warehousing, data lake, Azure cloud services, Azure DevOps. ETL (SSIS), ADF, Synapse, SQL Server, Azure SQL. Data transformation, modeling, ingestion, and integration. Microsoft Certified: Azure Data Engineer Associate.

Required Skills and Experience: 5-8 years of experience as a Data Engineer, focusing on Azure cloud services. Bachelor's degree in Computer Science, Information Technology, or a related field. Strong hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, and Azure Storage. Strong SQL skills, including experience with data modeling, complex queries, and performance optimization. Ability to work independently and manage multiple tasks simultaneously. Familiarity with version control systems (e.g., Git) and CI/CD pipelines (Azure DevOps). Knowledge of data lake architecture, data warehousing, and data modeling principles. Experience with RESTful APIs, data APIs, and event-driven architecture. Familiarity with data governance, lineage, security, and privacy best practices. Strong problem-solving, communication, and collaboration skills.

Skills: event-driven architecture, data transformation, modeling, data governance, Azure DevOps, Azure, Azure cloud services, RESTful APIs, SQL, data warehousing, Azure SQL, Azure Data Factory, data modeling, data security, ingestion, data lake architecture, data privacy, Synapse, ETL-SSIS, data APIs, integration, data, data lake, ADF, SQL Server, data lineage
Posted 1 week ago
0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Key Responsibilities:
Graph Database Development: Design, develop, and maintain graph database schemas using Neo4j.
Query Optimization: Optimize Neo4j queries for performance and efficiency.
Data Processing & Analysis: Utilize Python, PySpark, or Spark SQL for data transformation and analysis.
User Acceptance Testing (UAT): Conduct UAT to ensure data accuracy and overall system functionality.
Data Pipeline Management: Develop and manage scalable data pipelines using Databricks and Azure Data Factory (ADF).
Cloud Integration: Work with Azure cloud services and be familiar with Azure data engineering components.

Desired Skills: Strong experience with Neo4j and the Cypher query language. Proficiency in Python and/or PySpark. Hands-on experience with Databricks and Azure Data Factory. Familiarity with data engineering tools and best practices. Good understanding of database performance tuning. Ability to work in fast-paced, client-driven environments.

Skills: Azure, data engineering tools, Neo4j, PySpark, Azure Data Factory, Spark SQL, Databricks, cloud, database performance tuning, Cypher query language, Python
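As a hedged illustration of the Neo4j and Cypher work this role centers on, here is a minimal sketch using the official Python driver: create a small graph with MERGE and run a parameterized read query. The URI, credentials, and node/relationship schema are hypothetical.

```python
# Minimal Neo4j sketch: write with MERGE, then run a parameterized Cypher read.
# Connection details and schema below are hypothetical.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def add_purchase(tx, customer, product):
    # MERGE avoids duplicate nodes; the relationship models a purchase event.
    tx.run(
        "MERGE (c:Customer {name: $customer}) "
        "MERGE (p:Product {name: $product}) "
        "MERGE (c)-[:BOUGHT]->(p)",
        customer=customer, product=product,
    )

def top_products(tx, limit):
    result = tx.run(
        "MATCH (:Customer)-[:BOUGHT]->(p:Product) "
        "RETURN p.name AS product, count(*) AS buyers "
        "ORDER BY buyers DESC LIMIT $limit",
        limit=limit,
    )
    return [record.data() for record in result]

with driver.session() as session:
    session.execute_write(add_purchase, "Asha", "Laptop")
    print(session.execute_read(top_products, 5))
driver.close()
```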
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Data, Analytics & AI. Management Level: Manager.

Job Description & Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Key Responsibilities:
Designs, implements, and maintains reliable and scalable data infrastructure.
Writes, deploys, and maintains software to build, integrate, manage, maintain, and quality-assure data.
Designs, develops, and delivers large-scale data ingestion, data processing, and data transformation projects on the Azure cloud.
Mentors and shares knowledge with the team through design reviews, discussions, and prototypes.
Works with customers to deploy, manage, and audit standard processes for cloud products.
Adheres to and advocates for software and data engineering standard processes (e.g., technical design and review, unit testing, monitoring, alerting, source control, code review, and documentation).
Deploys secure and well-tested software that meets privacy and compliance requirements; develops, maintains, and improves the CI/CD pipeline.
Follows site-reliability engineering standard processes for service reliability: on-call rotations for services they maintain, and responsibility for defining and maintaining SLAs. Designs, builds, deploys, and maintains infrastructure as code. Containerizes server deployments.
Works as part of a cross-disciplinary team, closely with other data engineers, software engineers, data scientists, data managers, and business partners in a Scrum/Agile setup.

Job Requirements:
Education: Bachelor's or higher degree in Computer Science, Engineering, Information Systems, or other quantitative fields.
Experience:
1) Years of experience: 8 to 12 years of relevant experience.
2) Deep, hands-on experience designing, planning, productionizing, maintaining, and documenting reliable and scalable data infrastructure and data products in complex environments.
3) Hands-on experience with: a) Spark for data processing (batch and/or real-time); b) configuring Delta Lake on Azure Databricks; c) languages: SQL, PySpark, Python; d) cloud platforms: Azure; e) Azure Data Factory (must), Azure Data Lake (must), Azure SQL DB (must), Synapse (must), SQL Pools (must), Databricks (good to have); f) designing data solutions in Azure, including data distributions and partitions, scalability, cost management, disaster recovery, and high availability; g) Azure DevOps (or similar tools) for source control and building CI/CD pipelines.
4) Experience designing and implementing large-scale distributed systems.
5) Customer management and front-ending, with the ability to lead large organizations through influence.

Desirable Criteria: Strong customer management: own the delivery for the Data track with customer stakeholders. Continuous learning and improvement attitude.

Key Behaviors: Empathetic: cares about our people, our community, and our planet. Curious: seeks to explore and excel. Creative: imagines the extraordinary. Inclusive: brings out the best in each other.

Mandatory Skill Sets ('must have'): Synapse, ADF, Spark, SQL, PySpark, Spark-SQL.
Preferred Skill Sets ('good to have'): Cosmos DB, data modeling, Databricks, Power BI, experience building an analytics solution with SAP as the data source for ingestion pipelines.
Depth: The candidate should have in-depth, hands-on experience with end-to-end solution design in Azure Data Lake, ADF pipeline development and debugging, various file formats, Synapse, and Databricks, with excellent coding skills in PySpark and SQL and logic-building capabilities, plus sound knowledge of optimizing workloads.

Years of Experience Required: 8 to 12 years of relevant experience.
Education Qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above).
Degrees/Fields of Study Required: Master of Business Administration, Bachelor of Engineering, Bachelor of Technology, Master of Engineering.
Required Skills: Apache Synapse. Optional Skills: Microsoft Power Business Intelligence (BI).
Travel Requirements: Not specified. Available for Work Visa Sponsorship? No. Government Clearance Required? No.
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Advanced Energy: Advanced Energy Industries, Inc. (NASDAQ: AEIS) enables design breakthroughs and drives growth for leading semiconductor and industrial customers. Our precision power and control technologies, along with our applications know-how, inspire close partnerships and innovation in thin-film and industrial manufacturing. We are proud of our rich heritage and award-winning technologies, and we value the talents and contributions of all Advanced Energy employees worldwide.

Department: Data and Analytics. Team: Data Solutions Delivery Team.

Job Summary: We are seeking a highly skilled Data Engineer with 5-10 years of experience to join our Data and Analytics team. As a member of the Data Solutions Delivery team, you will be responsible for designing, building, and maintaining scalable data solutions. The ideal candidate should have extensive knowledge of Databricks, Azure Data Factory, and Google Cloud, along with strong data warehousing skills from data ingestion to reporting. Familiarity with the manufacturing and supply chain domains is highly desirable. Additionally, the candidate should be well versed in data engineering, data product, and data platform concepts, data mesh, medallion architecture, and establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview. The candidate should also have proven experience in implementing data quality practices using tools like Great Expectations, Deequ, etc.

Key Responsibilities: Design, build, and maintain scalable data solutions using Databricks, ADF, and Google Cloud. Develop and implement data warehousing solutions, including ETL processes, data modeling, and reporting. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions. Ensure data integrity, quality, and security across all data platforms. Provide expertise in data engineering, data product, and data platform concepts. Implement data mesh principles and medallion architecture to build scalable data platforms. Establish and maintain enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview. Implement data quality practices using tools like Great Expectations, Deequ, etc. Work closely with the manufacturing and supply chain teams to understand domain-specific data requirements. Develop and maintain documentation for data solutions, data flows, and data models. Act as an individual contributor, picking up tasks from technical solution documents and delivering high-quality results.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as a Data Engineer or in a similar role. In-depth knowledge of Databricks, Azure Data Factory, and Google Cloud. Strong data warehousing skills, including ETL processes, data modeling, and reporting. Familiarity with manufacturing and supply chain domains. Proficiency in data engineering, data product, and data platform concepts, data mesh, and medallion architecture. Experience in establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview. Proven experience in implementing data quality practices using tools like Great Expectations, Deequ, etc. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Ability to work independently and as part of a team.

Preferred Qualifications: Master's degree in a related field. Experience with cloud-based data platforms and tools. Certification in Databricks, Azure, or Google Cloud.

As part of our total rewards philosophy, we believe in offering and maintaining competitive compensation and benefits programs for our employees to attract and retain a talented, highly engaged workforce. Our compensation programs are focused on equitable, fair pay practices, including market-based base pay and an annual pay-for-performance incentive plan, and we offer a strong benefits package in each of the countries in which we operate. Advanced Energy is committed to diversity in its workforce, including Equal Employment Opportunity for Minorities, Females, Protected Veterans, and Individuals with Disabilities. We are committed to protecting and respecting your privacy. We take your privacy seriously and will only use your personal information to administer your application in accordance with RA No. 10173, also known as the Data Privacy Act of 2012.
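The posting names Great Expectations and Deequ for data quality. As a neutral, hedged stand-in for the same idea (not the API of either library), here is a hand-rolled PySpark quality gate that asserts row counts, null rates, and key uniqueness before promoting a table; the path and column names are hypothetical.

```python
# Hand-rolled data quality gate in PySpark, illustrating the kind of checks
# tools like Great Expectations or Deequ formalize. Names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.parquet("/curated/customers")  # hypothetical path

checks = {
    "non_empty": df.count() > 0,
    "customer_id_not_null": df.filter(F.col("customer_id").isNull()).count() == 0,
    "customer_id_unique": df.count() == df.select("customer_id").distinct().count(),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Fail loudly so the pipeline does not promote bad data downstream.
    raise ValueError(f"Data quality checks failed: {failed}")
print("All data quality checks passed")
```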
Posted 1 week ago
10.0 years
0 Lacs
Kochi, Kerala, India
On-site
Mars Data is hiring for full-time Dot Net Developer positions in Trivandrum/Kochi.

Skills: .NET/.NET Core 6/8+, T-SQL, Azure Cloud Services, Azure DevOps, React.JS/Angular.JS, C#, xUnit, MSTest, RDBMS, AWS, CI/CD, SDLC, RESTful APIs, PowerShell, Agile/Scrum/Jira.

Job Title: Dot Net Developer. Location: Trivandrum/Kochi. Job Type: Full Time. Working hours: 8 hours, mid shift. Notice Period: Immediate. Relevant Experience: 10+ years.

Introduction: Candidates with 10+ years of experience in the IT industry and with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud.

Responsibilities include:
Develop, enhance, document, and maintain application features in .NET Core 6/8+, C#, REST APIs, T-SQL, and AngularJS/React JS.
Application support and API integrations with third-party solutions/services.
Understand technical project priorities, implementation dependencies, risks, and issues.
Participate and develop code as part of a unified development group, working across the whole technology stack.
Identify, prioritize, and execute tasks in the software development life cycle.
Work with the team to define, design, and deliver new features.
Broad and extensive knowledge of the software development life cycle (SDLC), with software development models like Agile and Scrum and tools like Jira.

Primary Skills:
Develop high-quality software design and architecture.
10+ years of development experience in C#, .NET technologies, and SQL, with at least 2 years working with Azure Cloud Services.
Expertise in C#, .NET Core 6.0/8.0 or higher, Entity Framework, EF Core, microservices, Azure Cloud Services, Azure DevOps, and SOA.
Ability to lead, inspire, and motivate teams through effective communication and established credibility.
Guide the team to write reusable, testable, performant, and efficient code.
Proficient in writing unit test cases using xUnit and MSTest.
Build standards-based frameworks and libraries to support large-scale applications.
Expertise in RDBMS, including MS SQL Server, with thorough knowledge of writing SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types.
Experience in large-scale software development.
Prior experience in application support and API integrations.
Knowledge of architectural styles and design patterns, with experience designing solutions.
Strong debugging and problem-solving skills.
Effective communication skills, technical documentation, leadership, and ownership qualities.

Azure Skills:
Azure messaging services: Service Bus or Event Grid, Event Hub.
Azure Storage Account: Blobs, Tables, Queues, etc.
Azure Functions / Durable Functions.
Azure ADF and Logic Apps.
Azure DevOps: CI/CD pipelines (Classic / YAML).
Application Insights, Azure Monitoring, Key Vault, and SQL Azure.

Secondary Skills: Good knowledge of JavaScript, React JS, jQuery, Angular, and other front-end technologies. API Management (APIM). Azure containerization and container orchestration.

Contact: 8825984917, or send your resume to hr@marsdata.in
Posted 1 week ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Key Result Areas And Activities Design, develop, and deploy ETL/ELT solutions on-premises or in the cloud Transformation of data with stored procedures Report development (MicroStrategy/Power BI) Create and maintain comprehensive documentation for data pipelines, configurations, and processes Ensure data quality and integrity through effective data management practices Monitor and optimize data pipeline performance Troubleshoot and resolve data-related issues Technical Experience Must Have Good experience in Azure Synapse Good experience in ADF Good experience in Snowflake and stored procedures Experience with ETL/ELT processes, data warehousing, and data modelling Experience with data quality frameworks, monitoring tools, and job scheduling Knowledge of data formats like JSON, XML, CSV, and Parquet Fluent English (strong written, verbal, and presentation skills) Agile methodology and tools like JIRA Good communication and formal writing skills Good To Have Good experience in MicroStrategy and Power BI Experience in scripting languages such as Python, Java, or shell scripting Familiarity with Azure cloud platforms and cloud data services Qualifications Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field 3+ years of experience in Azure Synapse Qualities Experience with or knowledge of Agile software development methodologies Can influence and implement change; demonstrates confidence, strength of conviction, and sound decisions. Tackles problems head-on in a logical and systematic manner; is persistent and patient; can tackle a problem independently, is not over-critical of the factors that led to it, and is practical about it; follows up with developers on related issues. Able to consult, write, and present persuasively
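For illustration, a minimal data-quality gate of the kind this role describes ("ensure data quality and integrity", "monitor and optimize data pipeline performance") might look like the following PySpark sketch. The table and column names (sales_raw, order_id, amount) are hypothetical placeholders, not part of the posting.

```python
# Minimal data-quality gate for a Synapse/Databricks pipeline (illustrative sketch).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

df = spark.table("sales_raw")  # hypothetical source table

# Rule 1: the business key must be non-null and unique
null_keys = df.filter(F.col("order_id").isNull()).count()
dup_keys = df.groupBy("order_id").count().filter("count > 1").count()

# Rule 2: amounts must be non-negative
bad_amounts = df.filter(F.col("amount") < 0).count()

failures = {"null_keys": null_keys, "dup_keys": dup_keys, "bad_amounts": bad_amounts}
if any(v > 0 for v in failures.values()):
    # Raising here fails the activity, so the orchestrator's normal
    # alerting path surfaces the problem instead of bad data reaching reports.
    raise ValueError(f"Data-quality checks failed: {failures}")
```

Failing fast this way lets an orchestrator such as ADF mark the run as failed and trigger its alerting, rather than letting low-quality data flow downstream.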
Posted 1 week ago
7.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: business owners, business analytics, data engineering teams, application development, end users, and management teams. You Will: Design and build resilient and efficient data pipelines for batch and real-time streaming Architect and design data infrastructure on cloud using Infrastructure-as-Code tools. Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools. Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches. Collaborate with enterprise architects, data architects, ETL developers & engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities Address aspects such as data privacy & security, data ingestion & processing, data storage & compute, analytical & operational consumption, data modeling, data virtualization, self-service data preparation & analytics, AI enablement, and API integrations. Lead a team of engineers to deliver impactful results at scale. Execute projects with an Agile mindset. Build software frameworks to solve data problems at scale. Technical Requirements: 7+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse. Prior experience using DBT and Power BI will be a plus. 3+ years of experience architecting solutions for developing data pipelines from structured and unstructured sources for batch and real-time workloads. Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services, such as firewall, storage, and Key Vault, is required. Strong programming/scripting experience using SQL, Python, and Spark. Strong data modeling and data lakehouse concepts. Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket. Experience with Agile development methods in data-oriented projects Other Requirements: Highly motivated self-starter and team player with demonstrated success in prior roles. Track record of success working through technical challenges within enterprise organizations Ability to prioritize deals, training, and initiatives through highly effective time management Excellent problem-solving, analytical, presentation, and whiteboarding skills Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations Certifications in Azure Data Engineering and related technologies.
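As one concrete illustration of the batch and real-time pipelines this role describes, a minimal Databricks Auto Loader ingest into a Delta bronze layer might look like the sketch below. The storage account, container paths, and the event_id deduplication key are assumptions for the example only.

```python
# Illustrative streaming ingest with Databricks Auto Loader; all paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

raw = (spark.readStream
       .format("cloudFiles")                                   # Databricks Auto Loader
       .option("cloudFiles.format", "json")
       .option("cloudFiles.schemaLocation",
               "abfss://bronze@myaccount.dfs.core.windows.net/_schemas/events")
       .load("abfss://landing@myaccount.dfs.core.windows.net/events/"))

cleaned = (raw
           .withColumn("ingested_at", F.current_timestamp())
           .dropDuplicates(["event_id"]))                      # hypothetical business key

(cleaned.writeStream
 .format("delta")
 .option("checkpointLocation",
         "abfss://bronze@myaccount.dfs.core.windows.net/_chk/events")
 .trigger(availableNow=True)   # drain all available files, then stop: batch-style run
 .start("abfss://bronze@myaccount.dfs.core.windows.net/events"))
```

The same code serves both modes: swapping trigger(availableNow=True) for a processing-time trigger turns the scheduled drain into a continuously running stream.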
Posted 1 week ago
0 years
0 Lacs
India
On-site
We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines, architectures, and databases. The ideal candidate will have hands-on experience in Azure Data Factory (ADF), Azure Synapse, and SQL, and will support reporting teams by ensuring reliable, curated datasets are available for downstream analysis. Key Responsibilities: Design, develop, and manage scalable data pipelines to ingest and transform data from multiple source systems using Azure Data Factory (ADF) and Synapse Analytics. Build and maintain data integration workflows that enable downstream analytics, ensuring data is clean, consistent, and timely. Partner with Power BI developers and analysts, within the team and cross-functionally, to develop data models optimized for reporting and machine learning. Implement data validation and quality checks to detect anomalies or pipeline failures early, ensuring high data reliability. Maintain and improve existing ETL frameworks, enabling incremental data loads, performance tuning, and reusability of components. Collaborate with the Power BI development team to ensure data availability in curated formats that support self-service analytics. Write and maintain advanced SQL scripts, stored procedures, and views to transform and manipulate large datasets efficiently. Monitor pipeline performance and implement logging, alerting, and retry mechanisms to handle failures. Stay updated on Azure tools and recommend newer approaches (e.g., Dataflows, Delta Lake) where beneficial.
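The logging, alerting, and retry mechanisms mentioned in this posting can take many forms; the following Python sketch shows one minimal pattern, with notify_on_call as a hypothetical stand-in for whatever Azure Monitor, Logic Apps, or Teams integration is actually in place.

```python
# Illustrative retry-with-alert wrapper for a pipeline step (sketch only).
import logging
import time

log = logging.getLogger("pipeline")

def notify_on_call(message: str) -> None:
    """Placeholder hook: post to an Azure Monitor action group, Teams channel, etc."""
    log.error("ALERT: %s", message)

def run_with_retries(step, max_attempts: int = 3, backoff_seconds: int = 30):
    """Run a zero-argument callable, retrying with linear backoff before alerting."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("Attempt %d/%d failed: %s", attempt, max_attempts, exc)
            if attempt == max_attempts:
                notify_on_call(f"Step failed after {max_attempts} attempts: {exc}")
                raise
            time.sleep(backoff_seconds * attempt)  # wait longer after each failure
```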
Posted 2 weeks ago
6.0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities: Designs, implements and maintains reliable and scalable data infrastructure Writes, deploys and maintains software to build, integrate, manage, maintain, and quality-assure data Develops and delivers large-scale data ingestion, data processing, and data transformation projects on the Azure cloud Mentors and shares knowledge with the team to provide design reviews, discussions and prototypes Works with customers to deploy, manage, and audit standard processes for cloud products Adheres to and advocates for software & data engineering standard processes (e.g.
Data Engineering pipelines, unit testing, monitoring, alerting, source control, code review & documentation) Deploys secure and well-tested software that meets privacy and compliance requirements; develops, maintains, and improves the CI/CD pipeline Service reliability and following site-reliability engineering standard processes: on-call rotations for services they maintain, responsible for defining and maintaining SLAs. Designs, builds, deploys, and maintains infrastructure as code. Containerizes server deployments. Part of a cross-disciplinary team working closely with other data engineers, architects, software engineers, data scientists, data managers, and business partners in a Scrum/Agile setup Mandatory Skill Sets: ‘Must have’ knowledge, skills and experiences Synapse, ADF, Spark, SQL, PySpark, Spark SQL Preferred Skill Sets: ‘Good to have’ knowledge, skills and experiences Cosmos DB, data modeling, Databricks, Power BI, experience of having built an analytics solution with SAP as the data source for ingestion pipelines. Depth: The candidate should have in-depth hands-on experience with end-to-end solution design in Azure Data Lake, ADF pipeline development and debugging, various file formats, Synapse, and Databricks, with excellent coding skills in PySpark and SQL and logic-building capabilities. The candidate should have sound knowledge of optimizing workloads. Years Of Experience Required: 6 to 9 years relevant experience Education Qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above) Expected Joining: 3 weeks Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Technology, Master of Business Administration, Bachelor of Engineering, Master of Engineering Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills Structured Query Language (SQL) Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
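The "sound knowledge of optimizing workloads" this posting asks for can be made concrete with two of the most common Spark techniques, partition pruning and broadcast joins. The sketch below is illustrative; the lake.orders / lake.dim_customer tables and their columns are hypothetical.

```python
# Two everyday Spark optimizations: partition pruning and broadcast joins (sketch).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Pruning: filtering on the partition column (order_date here) lets Spark skip
# whole directories instead of scanning the full table.
orders = spark.table("lake.orders").filter(F.col("order_date") == "2024-06-01")

# Broadcast join: ship the small dimension table to every executor so the large
# fact table is never shuffled across the cluster.
dim_customer = spark.table("lake.dim_customer")
joined = orders.join(F.broadcast(dim_customer), "customer_id")

# Inspect the physical plan: look for PartitionFilters and BroadcastHashJoin.
joined.explain()
```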
Posted 2 weeks ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Analyze, design, develop, troubleshoot, and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications. As a member of the software engineering division, you will perform high-level design based on provided external specifications. Specify, design, and implement minor changes to existing software architecture. Build highly complex enhancements and resolve complex bugs. Build and execute unit tests and unit plans. Review integration and regression test plans created by QA. Communicate with QA and porting engineering as necessary to discuss minor changes to product functionality and to ensure quality and consistency across specific products. Duties and tasks are varied and complex, requiring independent judgment. Fully competent in own area of expertise. May have a project lead role and/or supervise lower-level personnel. BS or MS degree or equivalent experience relevant to the functional area. 4 years of software engineering or related experience. Career Level - IC3 Responsibilities Our Procurement Cloud is the key offering from the Oracle Applications Cloud Suite. Procurement Cloud is a fast-growing division within Oracle Cloud Applications and has a variety of customers, from a leading fast-food joint to the world's largest furniture maker. Procurement Cloud Development works on different sophisticated areas, from a complex search engine to a time-critical auctions/bidding process to core business functionalities like bulk order processing, just to name a few. As a member of our team, you will use the latest technologies, including JDeveloper, ADF, Oracle 12c Database, Oracle SQL, BPEL, Oracle Text, BC4J, web services, and service-oriented architectures (SOA). In addition to gaining this technical experience, you will also be exposed to the business side of the industry. Developers are involved in the entire development cycle, so you will have the chance to take part in activities such as working with the product management team to define the product’s functionality and interacting with customers to resolve issues. So are you looking to be technically challenged and gain business experience? Do you want to be part of a team of upbeat, hard-working developers who know how to work and have fun at the same time? Well, look no further. Join us and be the newest member of Fusion Procurement Development! Skills/languages: 1-8 years of experience in building Java-based applications. Good programming skills and excellent analytical/logical skills. Able to craft a feature from end to end. Can think out of the box, has practical knowledge of the given technologies, and can apply logic to tackle a technical problem even without prior background in it. Should be persistent in their efforts. Experience in BPEL, Workflow System, ADF, REST implementation, AI/ML, and Scrum processes is a plus. Required: Java, OOP concepts, JavaScript/VBCS/JET Optional: JDBC, XML, SQL, PL/SQL, Unix/Linux, REST, ADF, AI/ML, Scrum Analyze, design, develop, fix, and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications. Qualifications Career Level - IC3 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity.
We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 2 weeks ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description The main role of a Support Engineer is to troubleshoot and resolve highly complex techno-functional problems. The key skills put to use on a daily basis are a high level of techno-functional skills, Oracle products knowledge, problem-solving skills, and customer interaction/service expertise. Education & Experience: BE, BTech, MCA, CA or equivalent preferred. Other qualifications with adequate experience may be considered. 5+ years relevant working experience. Functional/Technical Knowledge & Skills: Must have a good understanding of the following Oracle Cloud Financials version 12+ capabilities: We are looking for a techno-functional person who has real-time hands-on functional/product and/or technical experience; and/or has worked with L2 or L3 level support; and/or has equivalent knowledge. We expect the candidate to have: Strong business process knowledge and concepts. Implementation/support experience in any of the areas - ERP - Cloud Financial modules like GL, AP, AR, FA, IBY, PA, CST, ZX, and PSA; or HCM - Core HR, Benefits, Absence, T&L, Payroll, Compensation, Talent Management; or SCM - Inventory, OM, Procurement. The candidate must have hands-on experience in a minimum of 5 modules across the above pillars. Ability to relate the product functionality to business processes, and thus offer implementation advice to customers on how to meet their various business scenarios using Oracle Cloud Financials. Technically strong, with expert skills in SQL, PL/SQL, OTBI/BIP/FRS reports, FBDI, ADFDI, BPM workflows, ADF Faces, BI Extract for FTP, Payment Integration, and Personalization. Strong problem-solving skills. Strong customer interaction and service orientation, so you can understand customers’ critical situations, provide the appropriate response, and mobilize organizational resources, while setting realistic expectations for customers. Strong operations management and innovation orientation, so you can continually improve the processes, methods, tools, and utilities. Strong team player, so you leverage each other’s strengths. You will often be engaged in collaboration with peers within and across teams. Strong learning orientation, so you keep abreast of emerging business models/processes, applications product solutions, product features, and technology features – and use this learning to deliver value to customers on a daily basis. High flexibility, so you remain agile in a fast-changing business and organizational environment. Create and maintain appropriate documentation for architecture, design, technical, implementation, support, and test activities. Personal Attributes: Self-driven and result-oriented Strong problem-solving/analytical skills Strong customer support and relation skills Effective communication (verbal and written) Focus on relationships (internal and external) Strong willingness to learn new things and share them with others Influencing/negotiating Team player Customer focused Confident and decisive Values expertise (maintaining professional expertise in own discipline) Enthusiasm Flexibility Organizational skills Values and enjoys coaching/knowledge transfer Values and enjoys teaching technical courses Note: Shift working is mandatory. The candidate should be open to working evening and night shifts on a rotation basis. Career Level - IC3 Responsibilities As a Sr.
Support Engineer, you will be the technical interface to customers, Original Equipment Manufacturers (OEMs), and Value-Added Resellers (VARs) for the resolution of problems related to the installation, recommended maintenance, and use of Oracle products. You will have an understanding of all Oracle products in your competency area and in-depth knowledge of several products and/or platforms. You should also be highly experienced on multiple platforms and able to complete assigned duties with minimal direction from management. In this position, you will routinely act independently while researching and developing solutions to customer issues. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 2 weeks ago
6.0 - 8.0 years
8 - 10 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions. Key Responsibilities: Build scalable ETL pipelines and implement robust data solutions in Azure. Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults. Design and maintain secure and efficient data lake architecture. Work with stakeholders to gather data requirements and translate them into technical specs. Implement CI/CD pipelines for seamless data deployment using Azure DevOps. Monitor data quality, performance bottlenecks, and scalability issues. Write clean, organized, reusable PySpark code in an Agile environment. Document pipelines, architectures, and best practices for reuse. Must-Have Skills: Experience: 6+ years in Data Engineering. Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults. Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance, Agile, SDLC, Containerization (Docker), Clean coding practices. Good-to-Have Skills: Event Hubs, Logic Apps, Power BI, strong logic building and competitive programming background. Contract Details: Role: Senior Data Engineer. Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, India. Duration: 6 Months. Email to Apply: navaneeta@suzva.com Contact: 9032956160
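As an illustration of the "ADF, Databricks, ADLS Gen2, and Key Vaults" combination above, a Databricks job often reads its storage credential from a Key Vault-backed secret scope. The sketch assumes a scope named kv-scope and placeholder account/container names; dbutils is injected by the Databricks runtime rather than imported.

```python
# Sketch: pull an ADLS key from Azure Key Vault inside a Databricks notebook/job.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# "kv-scope" is a hypothetical Key Vault-backed secret scope; dbutils is provided
# by the Databricks runtime (this line will not run outside Databricks).
storage_key = dbutils.secrets.get(scope="kv-scope", key="adls-access-key")

spark.conf.set(
    "fs.azure.account.key.myaccount.dfs.core.windows.net",  # placeholder account
    storage_key,
)

df = spark.read.parquet("abfss://silver@myaccount.dfs.core.windows.net/sales/")
df.show(5)
```

Keeping the key in Key Vault means the notebook never contains a credential, and rotation happens in one place.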
Posted 2 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
Mumbai, Hyderabad, Pune
Work from Office
Job type: Contract to Hire. 1. 4-5 years of hands-on experience with the GenRocket tool. 2. Strong SQL knowledge, with the ability to write complex queries (for example, left and right joins). 3. Strong knowledge of SQL Server, MSSQL, Microsoft Azure, ADF, and Synapse for database validation. 4. Intermediate knowledge of ETL transformations, workflows, and STTM mappings (source-to-target data mappings). 5. Strong knowledge of PowerShell scripting. 6. Ability to test data validation and data transformation from source to target. 7. Data validation: validating data sources, extracting data, and applying transformation logic. 8. Test planning & execution: defining testing scope, preparing test cases and test conditions, and test data preparation. 9. Coordinating test activities with Dev, BA & DBA teams and conducting defect triage for resolution of issues. 10. Test quality: ensuring the quality of their own work and the work of the development team. 11. QA test documentation: creating and maintaining test plans and test deliverable documents such as QA estimates, the RTM (requirement traceability matrix), peer reviews, and QA sign-off documents. 12. Hands-on experience working with ADO/JIRA for test management, defect reporting, and dashboard creation. 13. Ability to identify and report risks and provide mitigation plans. Coordinate with internal and external teams for completion of activities. If you are interested, please share your updated profile with the details below. Current CTC Expected CTC Notice Period Total Experience Relevant Experience Location Mumbai, Pune, Bangalore, Hyderabad
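The source-to-target validation this role centers on usually reduces to comparing aggregates between the two systems. A minimal Python sketch using pyodbc follows; the DSNs, table names, and chosen checks are placeholders, and any SQL Server-capable driver would work equally well.

```python
# Illustrative source-to-target reconciliation for ETL testing (sketch only).
import pyodbc

def scalar(conn, sql: str):
    """Run a single-value query and return the result."""
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchone()[0]

src = pyodbc.connect("DSN=source_db")  # placeholder connection strings
tgt = pyodbc.connect("DSN=target_db")

# Each check pairs a source query with the equivalent target query.
checks = {
    "row_count": ("SELECT COUNT(*) FROM dbo.orders",
                  "SELECT COUNT(*) FROM dw.fact_orders"),
    "amount_sum": ("SELECT SUM(amount) FROM dbo.orders",
                   "SELECT SUM(amount) FROM dw.fact_orders"),
}

for name, (src_sql, tgt_sql) in checks.items():
    s, t = scalar(src, src_sql), scalar(tgt, tgt_sql)
    print(f"{name}: source={s} target={t} -> {'PASS' if s == t else 'FAIL'}")
```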
Posted 2 weeks ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Azure Data Engineer – Databricks Required Technical Skill Set: Data Lake architecture, Azure services – ADLS, ADF, Azure Databricks, Synapse. Build the solution for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure data ingestion and transformation components. The following technology skills are required: Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Experience with ADF and Dataflow. Experience with big data tools like Delta Lake and Azure Databricks. Experience with Synapse. Skills in designing an Azure data solution. Assemble large, complex data sets that meet functional and non-functional business requirements.
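For illustration, the "optimal extraction, transformation, and loading" into Delta Lake that this role describes often lands on a merge (upsert) pattern. The sketch below assumes the delta-spark package is available (it ships with Databricks); the paths and the customer_id key are placeholders.

```python
# Illustrative incremental upsert into a Delta table (placeholder paths and key).
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# New/changed records landed by the ingestion layer.
updates = spark.read.parquet(
    "abfss://landing@myaccount.dfs.core.windows.net/customers/")

target = DeltaTable.forPath(
    spark, "abfss://silver@myaccount.dfs.core.windows.net/customers")

# Upsert: update rows whose key already exists, insert the rest.
(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```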
Posted 2 weeks ago
The job market for ADF (Application Development Framework) professionals in India is witnessing significant growth, with numerous opportunities available for job seekers in this field. ADF is a popular framework used for building enterprise applications, and companies across various industries are actively looking for skilled professionals to join their teams.
Here are 5 major cities in India where there is a high demand for ADF professionals: - Bangalore - Hyderabad - Pune - Chennai - Mumbai
The estimated salary range for ADF professionals in India varies based on experience levels: - Entry-level: INR 4-6 lakhs per annum - Mid-level: INR 8-12 lakhs per annum - Experienced: INR 15-20 lakhs per annum
In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.
In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.
Here are sample interview questions for ADF roles, categorized by difficulty level: - Basic: - What is ADF, and what are its key features? - What is the difference between ADF Faces and ADF Task Flows? - Medium: - Explain the lifecycle of an ADF application. - How do you handle exceptions in ADF applications? - Advanced: - Discuss the advantages of using ADF Business Components. - How would you optimize performance in an ADF application?
As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!