Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques; designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements; working in an Agile, collaborative environment, partnering with scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours; and building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: We are seeking a skilled Azure Data Engineer with 5+ years of experience, including 3+ years of hands-on experience with ADF/Databricks. The ideal candidate will have Databricks, Data Lake and Python programming skills, as well as experience deploying to Databricks and familiarity with Azure Data Factory. Preferred technical and professional experience: Good communication skills. 3+ years of experience with ADF/DB/Data Lake. Ability to communicate results to technical and non-technical audiences.
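The "cleanse and integrate data in an efficient and reusable manner" responsibility above can be sketched in minimal Python. This is a generic illustration only: the field names and rules are assumptions for the example, not taken from IBM's actual pipeline.

```python
from datetime import datetime

def cleanse_records(records, required_fields=("id", "timestamp")):
    """Reusable cleansing step: drop incomplete rows, trim strings, parse timestamps."""
    cleaned = []
    for row in records:
        if any(row.get(f) in (None, "") for f in required_fields):
            continue  # skip rows missing a required field
        out = {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
        out["timestamp"] = datetime.fromisoformat(out["timestamp"])
        cleaned.append(out)
    return cleaned

raw = [
    {"id": "1", "timestamp": "2024-01-05T10:00:00", "city": " Mumbai "},
    {"id": "", "timestamp": "2024-01-05T11:00:00", "city": "Pune"},  # dropped: empty id
]
print(cleanse_records(raw))  # only the complete first row survives, with trimmed city
```

The same function can be reused across pipelines because the required-field list is a parameter rather than hard-coded.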
Posted 19 hours ago
5.0 years
8 - 12 Lacs
Hyderabad
Work from Office
When our values align, there's no limit to what we can achieve. At Parexel, we all share the same goal - to improve the world's health. From clinical trials to regulatory, consulting, and market access, every clinical development solution we provide is underpinned by something special - a deep conviction in what we do. Each of us, no matter what we do at Parexel, contributes to the development of a therapy that ultimately will benefit a patient. We take our work personally, we do it with empathy and we're committed to making a difference. Key Accountabilities : Using Microsoft Azure data PaaS services, design, build, modify, and support data pipelines leveraging DataBricks and PowerBI in a medallion architecture setting. If necessary, create prototypes to validate proposed ideas and solicit input from stakeholders. Excellent grasp of and expertise with test-driven development and continuous integration processes. Analysis and Design – Converts high-level design to low-level design and implements it. Collaborate with Team Leads to define/clarify business requirements, estimate development costs, and finalize work plans. Testing – Create and run unit and integration tests on all created code throughout the development lifecycle. Benchmark application code proactively to prevent performance and scalability concerns. Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management. Support and Troubleshooting – Assist the Operations Team with any environmental issues that arise during application deployment in the Development, QA, Staging, and Production environments. Assist other teams in resolving issues that may develop as a result of applications or the integration of multiple components. Knowledge and Experience : Understanding of design concepts and architectural basics. Knowledge of performance engineering. Understanding of quality processes and estimate methods. Fundamental grasp of the project domain. 
The ability to transform functional and nonfunctional needs into system requirements. The ability to develop and code complicated applications is required. The ability to create test cases and scenarios based on specifications. Solid knowledge of the SDLC and agile techniques. Knowledge of current technology and trends. Logical thinking and problem-solving abilities, as well as the capacity to collaborate. Primary skills: Cloud Platform, Azure, Databricks, ADF, ADO. Sought: SQL, Python, PowerBI. General knowledge: PowerApps, Java. 3-5 years of experience in software development, with a minimum of 2 years in cloud computing. Education: Bachelor of Science in Computer Science, Engineering, or a related technical field.
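The test-driven expectation above ("create and run unit and integration tests throughout the development lifecycle") can be illustrated with a tiny, hypothetical pipeline step and its unit tests in plain Python; the helper and its rules are invented for the example, not Parexel's code.

```python
def parse_visit_count(raw):
    """Hypothetical pipeline step: cast a free-text visit count to int, defaulting to 0."""
    try:
        return int(raw.strip())
    except (ValueError, AttributeError):
        return 0  # unparseable or missing values are normalised to zero

# Unit tests written alongside the step, test-driven style.
assert parse_visit_count(" 3 ") == 3
assert parse_visit_count("n/a") == 0
assert parse_visit_count(None) == 0
print("all unit tests passed")
```

In a real project the same assertions would live in a test suite run by the CI pipeline on every commit.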
Posted 21 hours ago
2.0 - 4.0 years
4 - 6 Lacs
Hyderabad
Work from Office
Overview The Data Science Team works on developing Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools/Spark/Databricks and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Azure Pipelines. You will be part of a collaborative interdisciplinary team around data, where you will be responsible for the continuous delivery of statistical/ML models. You will work closely with process owners, product owners and final business users. This will give you the right visibility into, and understanding of, the criticality of your developments. Responsibilities Delivery of key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and Machine Learning models in scope. Active contributor to code & development in projects and services. Partner with data engineers to ensure data access for discovery and that proper data is prepared for model consumption. Partner with ML engineers working on industrialization. Communicate with business stakeholders in the process of service design, training and knowledge transfer. Support large-scale experimentation and build data-driven models. Refine requirements into modelling problems. Influence product teams through data-based recommendations. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. Create reusable packages or libraries. 
Ensure on-time and on-budget delivery which satisfies project requirements, while adhering to enterprise architecture standards. Leverage big data technologies to help process data and build scaled data pipelines (batch to real time). Implement the end-to-end ML lifecycle with Azure Databricks and Azure Pipelines. Automate ML model deployments. Qualifications BE/B.Tech in Computer Science, Maths or related technical fields. Overall 2-4 years of experience working as a Data Scientist. 2+ years of experience building solutions in the commercial or supply chain space. 2+ years working in a team to deliver production-level analytic solutions. Fluent in git (version control). Understanding of Jenkins and Docker is a plus. Fluent in SQL syntax. 2+ years of experience in statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems. 2+ years of experience in developing business-problem-related statistical/ML modeling with industry tools, with a primary focus on Python or PySpark development. Data Science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models. Knowledge of time series/demand forecast models is a plus. Programming Skills: hands-on experience in statistical programming languages like Python and PySpark, and database query languages like SQL. Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression and maximum likelihood estimators. Cloud (Azure): experience in Databricks and ADF is desirable. Familiarity with Spark, Hive and Pig is an added advantage. Business storytelling and communicating data insights in a business-consumable format. Fluent in one visualization tool. Strong communication and organizational skills with the ability to deal with ambiguity while juggling multiple priorities. Experience with Agile methodology for teamwork and analytics product creation. Experience in Reinforcement Learning is a plus. 
Experience in simulation and optimization problems in any space is a plus. Experience with Bayesian methods is a plus. Experience with causal inference is a plus. Experience with NLP is a plus. Experience with Responsible AI is a plus. Experience with distributed machine learning is a plus. Experience in DevOps, with hands-on experience with one or more cloud service providers: AWS, GCP, Azure (preferred). Model deployment experience is a plus. Experience with version control systems like GitHub and CI/CD tools. Experience in Exploratory Data Analysis. Knowledge of MLOps/DevOps and deploying ML models is preferred. Experience using MLflow, Kubeflow etc. is preferred. Experience executing and contributing to MLOps automation infrastructure is good to have. Exceptional analytical and problem-solving skills. Stakeholder engagement (BU, vendors). Experience building statistical models in the retail or supply chain space is a plus.
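As a minimal sketch of the supervised-regression skill this listing asks for, here is single-feature ordinary least squares in plain Python. It is illustrative only; production work would use libraries such as scikit-learn or Spark ML, and the data is made up.

```python
def fit_ols(xs, ys):
    """Closed-form ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n          # feature and target means
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var                               # slope = cov(x, y) / var(x)
    b = my - a * mx                             # intercept through the means
    return a, b

slope, intercept = fit_ols([1, 2, 3, 4], [3, 5, 7, 9])
print(slope, intercept)  # 2.0 1.0 — the points lie exactly on y = 2x + 1
```

The same closed form generalises to the multivariate normal equations, which is where libraries take over.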
Posted 3 days ago
15.0 - 20.0 years
17 - 22 Lacs
Pune
Work from Office
Oracle EPM Architect. 15+ years of experience (including a minimum of 5 years in Oracle EPM implementation/architecture). Seeking a highly skilled Oracle EPM Architect to lead the design, implementation, and management of Oracle Enterprise Performance Management (EPM) solutions. This role requires deep expertise in the Oracle EPM Cloud and/or Hyperion stack, including strategic planning, solution design, and technical leadership across financial consolidation, planning, budgeting, and forecasting. Lead the architecture and design of Oracle EPM Cloud solutions, including modules such as: Planning and Budgeting Cloud Service (PBCS/EPBCS), Financial Consolidation and Close (FCCS), Enterprise Data Management (EDM), Profitability and Cost Management (PCM). Define and enforce best practices, integration standards, and governance models for EPM solutions. Engage with finance, IT, and business stakeholders to gather requirements and translate them into scalable EPM designs. Develop roadmaps, implementation strategies, and solution blueprints. Guide technical and functional consultants throughout the implementation lifecycle. Lead data integration efforts between Oracle EPM and ERP/other source systems. Ensure EPM solutions meet performance, security, compliance, and audit standards. Provide thought leadership in Oracle EPM innovations, product releases, and architecture trends. Support migration from on-premise Hyperion applications to EPM Cloud (if applicable). Conduct architecture reviews, performance tuning, and code quality assurance. Support post-go-live activities including training, documentation, and optimization.
Posted 3 days ago
3.0 - 7.0 years
6 - 10 Lacs
Mumbai
Work from Office
Senior Azure Data Engineer - L1 Support
Posted 3 days ago
3.0 - 7.0 years
5 - 9 Lacs
Gurugram
Work from Office
Develop partnerships with key stakeholders in HR to understand the strategic direction, business processes, and business needs. Should be well versed in Agile/Scrum/DevOps. Create technical solutions to meet business requirements. Help Finance business users adopt best practices. Excellent verbal & written communication skills. Define user information requirements in Oracle E-Business Suite. Implement plans to test business and functional processes. Manage test scripts that support Oracle R12 financial applications. Lead technical acceptance testing (Unit, SIT, and QAT) of patches and upgrades. Deliver training content to users. Candidate must be ready to work from the office daily, and in shifts if required. No work from home allowed. Required education Bachelor's Degree Preferred education Master's Degree
Posted 4 days ago
15.0 - 20.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Microsoft Fabric Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning to align application development with organizational goals, ensuring that the solutions provided are effective and efficient. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge. - Facilitate regular team meetings to discuss progress and address any roadblocks. Professional & Technical Skills: Lead and manage a team of data engineers, providing guidance, mentorship, and support. Foster a collaborative and innovative team culture. 
Work closely with stakeholders to understand data requirements and business objectives. Translate business requirements into technical specifications for the Data Warehouse. Lead the design of data models, ensuring they meet business needs and adhere to best practices. Collaborate with the Technical Architect to design dimensional models for optimal performance. Design and implement data pipelines for ingestion, transformation, and loading (ETL/ELT) using Fabric Data Factory Pipelines and Dataflows Gen2. Develop scalable and reliable solutions for batch data integration across various structured and unstructured data sources. Oversee the development of data pipelines for smooth data flow into the Fabric Data Warehouse. Implement and maintain data solutions in Fabric Lakehouse and Fabric Warehouse. Monitor and optimize pipeline performance, ensuring minimal latency and resource efficiency. Tune data processing workloads for large datasets in Fabric Warehouse and Lakehouse. Exposure to ADF and Databricks. Additional Information: - The candidate should have a minimum of 5 years of experience in Microsoft Fabric. - This position is based in Hyderabad. - A 15 years full time education is required. Qualification 15 years full time education
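The dimensional-modelling responsibility above can be sketched, in deliberately simplified form, as splitting raw rows into a dimension table with surrogate keys plus fact rows. This pure-Python illustration stands in for what a Fabric pipeline or Dataflow would do at scale; every name in it is invented for the example.

```python
def load_star(rows):
    """Split raw sales rows into a product dimension and fact rows with surrogate keys."""
    dim, facts = {}, []
    for r in rows:
        # setdefault assigns the next surrogate key only when the product is new
        key = dim.setdefault(r["product"], len(dim) + 1)
        facts.append({"product_key": key, "qty": r["qty"], "amount": r["amount"]})
    return dim, facts

dim, facts = load_star([
    {"product": "widget", "qty": 2, "amount": 10.0},
    {"product": "gadget", "qty": 1, "amount": 4.0},
    {"product": "widget", "qty": 3, "amount": 15.0},
])
print(dim)  # {'widget': 1, 'gadget': 2}
```

The fact rows carry only the compact surrogate key, which is the property that makes dimensional models fast to join and aggregate.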
Posted 4 days ago
5.0 - 8.0 years
5 - 10 Lacs
Kolkata
Work from Office
Skill required: Tech for Operations - Microsoft Azure Cloud Services Designation: App Automation Eng Senior Analyst Qualifications: Any Graduation/12th/PUC/HSC Years of Experience: 5 to 8 years About Accenture Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com. What would you do In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients' supply chains and customer experience. The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting & maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience in troubleshooting Azure Data Factory environments, conducting code reviews, and bug fixing. 
This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems. What are we looking for Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Proven experience (5+ years) as an Azure Data Factory Support Engineer II. Expertise in ADF with a deep understanding of its data-related libraries. Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments. Proficient in SQL and experienced with SQL database design. Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy. Experience with ADF pipelines. Excellent problem-solving and troubleshooting skills. Experience in code review and debugging in a collaborative project setting. Excellent verbal and written communication skills. Ability to work in a fast-paced, team-oriented environment. Strong understanding of the business and a passion for the mission of Service Supply Chain. Hands-on experience with Jira, DevOps ticketing and ServiceNow is good to have. Roles and Responsibilities: Innovate. Collaborate. Build. Create. Solve. ADF & associated systems: ensure systems meet business requirements and industry practices. Integrate new data management technologies and software engineering tools into existing structures. Recommend ways to improve data reliability, efficiency, and quality. Use large data sets to address business issues. Use data to discover tasks that can be automated. Fix bugs to ensure a robust and sustainable codebase. Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance. Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively. 
Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines. Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives. Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure. Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy. Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance. Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement. Stay updated with the latest technologies and practices in ADF to continuously improve the support and maintenance of data systems. Flexible working hours, including US time zones; this position may require you to work a rotational on-call schedule, evenings, weekends, and holiday shifts when the need arises. Participate in the Demand Management and Change Management processes. Work in partnership with internal business teams, external third-party technical teams and functional teams as a technology partner in communicating and coordinating delivery of technology services from Technology For Operations (TfO). Qualification: Any Graduation/12th/PUC/HSC
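The monitoring and minimal-downtime duties above typically rest on simple retry-with-backoff logic around pipeline operations. Here is a generic Python sketch; it is not ADF-specific, and the flaky callable below is simulated for illustration.

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn, retrying with exponential backoff; re-raise after the last attempt."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))  # back off: 0.01s, 0.02s, ...

calls = {"n": 0}
def flaky():
    """Simulated transient failure: errors twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(with_retries(flaky))  # ok — succeeds on the third attempt
```

Real support tooling would additionally log each failure and alert once retries are exhausted, rather than silently swallowing the intermediate errors.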
Posted 4 days ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Data Engineer - Azure Synapse/ADF, Workiva. To manage and maintain the associated Connectors, Chains, Tables and Queries, making updates as needed as new metrics or requirements are identified. Develop functional and technical requirements for any changes impacting wData (Workiva Data). Configure and unit test any changes impacting wData (connector, chains, tables, queries). Promote wData changes.
Posted 4 days ago
1.0 - 4.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Req ID: 321505 We are currently seeking a Test Analyst to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Duties: • Understand business requirements and develop test cases. • Work with the tech team and client to validate and finalise test cases. • Use Jira or an equivalent test management tool to record test cases, expected results and outcomes, and assign defects. • Run the testing phases - SIT and UAT. • Test Reporting & Documentation. • Basic knowledge of Snowflake, SQL, ADF (optional) and Fivetran (optional). Minimum Skills Required: • Test case development. • Jira knowledge to record test cases, expected results and outcomes, and assign defects. • Test Reporting & Documentation. • Basic knowledge of Snowflake, SQL, ADF (optional) and Fivetran (optional).
Posted 6 days ago
1.0 - 4.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Req ID: 321498 We are currently seeking a Data Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Duties: • Work closely with the Lead Data Engineer to understand business requirements, and analyse and translate these requirements into technical specifications and solution design. • Work closely with the Data Modeller to ensure data models support the solution design. • Develop, test and fix ETL code using Snowflake, Fivetran, SQL and stored procedures. • Analyse the data and ETL for defects/service tickets raised (for solutions in production). • Develop documentation and artefacts to support projects. Minimum Skills Required: • ADF • Fivetran (orchestration & integration) • SQL • Snowflake DWH
Posted 6 days ago
3.0 - 6.0 years
3 - 6 Lacs
Chennai
Work from Office
Mandatory Skills: Azure DevOps, CI/CD pipelines, Kubernetes, Docker, cloud tech stack and ADF, Spark, Databricks, Jenkins, Java-based application builds, Java Web, Git, J2EE. - To design and develop automated deployment arrangements by leveraging configuration management technology. - Implementing various development, testing and automation tools, and IT infrastructure. - Selecting and deploying appropriate CI/CD tools. Required Candidate profile - Implementing various development, testing and automation tools, and IT infrastructure. - Selecting and deploying appropriate CI/CD tools.
Posted 1 week ago
5.0 - 10.0 years
14 - 18 Lacs
Bengaluru
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques; designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements; working in an Agile, collaborative environment, partnering with scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours; and building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: We are seeking a skilled Azure Data Engineer with 5+ years of experience, including 3+ years of hands-on experience with ADF/Databricks. The ideal candidate will have Databricks, Data Lake and Python programming skills, as well as experience deploying to Databricks and familiarity with Azure Data Factory. Preferred technical and professional experience: Good communication skills. 3+ years of experience with ADF/DB/Data Lake. Ability to communicate results to technical and non-technical audiences.
Posted 1 week ago
5.0 - 10.0 years
5 - 9 Lacs
Mumbai
Work from Office
Roles & Responsibilities: Resource must have 5+ years of hands-on experience in Azure Cloud development (ADF + Databricks) - mandatory. Strong in Azure SQL, and good to have knowledge of Synapse/Analytics. Experience working on Agile projects and familiarity with Scrum/SAFe ceremonies. Good communication skills - written & verbal. Can work directly with the customer. Ready to work in 2nd shift; flexible. Defines, designs, develops and tests software components/applications using Microsoft Azure - Databricks, ADF, ADL, Hive, Python, SparkSQL, PySpark. Expertise in Azure Databricks, ADF, ADL, Hive, Python, Spark, PySpark. Strong T-SQL skills with experience in Azure SQL DW. Experience handling structured and unstructured datasets. Experience in data modeling and advanced SQL techniques. Experience implementing Azure Data Factory pipelines using the latest technologies and techniques. Good exposure to application development. The candidate should work independently with minimal supervision.
Posted 1 week ago
3.0 - 6.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Data Engineering & Pipeline Development: Design, implement and maintain ETL processes using ADF and ADB. Create and manage views in ADB and SQL for efficient data access. Optimize SQL queries for large datasets and high performance. Conduct end-to-end testing and impact analysis on data pipelines. Optimization & Performance Tuning: Identify and resolve bottlenecks in data processing. Optimize SQL queries and Delta Tables for fast data processing. Data Sharing & Integration: Implement Delta Share, SQL Endpoints, and other data sharing methods. Use Delta Tables for efficient data sharing and processing. API Integration & Development: Integrate external systems through Databricks Notebooks and build scalable solutions. Experience in building APIs (good to have). Collaboration & Documentation: Collaborate with teams to understand requirements and design solutions. Provide documentation for data processes and architectures.
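The "create and manage views" and "optimize SQL queries" duties above can be illustrated with an in-memory SQLite example: a view exposing the latest row per key, backed by an index on the partition/order columns. The table and column names are invented for the example; the same pattern applies to Databricks SQL.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE events (id INTEGER, ts TEXT, status TEXT);
    INSERT INTO events VALUES
        (1, '2024-01-01', 'new'), (1, '2024-01-02', 'done'),
        (2, '2024-01-01', 'new');
    -- A view as a reusable access layer: only the latest row per id.
    CREATE VIEW latest_events AS
        SELECT id, ts, status FROM (
            SELECT *, ROW_NUMBER() OVER (PARTITION BY id ORDER BY ts DESC) AS rn
            FROM events
        ) WHERE rn = 1;
    -- An index matching the window's partition and order columns.
    CREATE INDEX idx_events_id_ts ON events (id, ts DESC);
""")
rows = con.execute("SELECT id, status FROM latest_events ORDER BY id").fetchall()
print(rows)  # [(1, 'done'), (2, 'new')]
```

Consumers query the view, so the dedup logic lives in one place and can be re-tuned without touching downstream code.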
Posted 1 week ago
2.0 - 8.0 years
6 - 10 Lacs
Kolkata, Mumbai, Hyderabad
Work from Office
ROLE 1: - PowerBI and AAS expert (Strong SC or Specialist Senior) - Should have hands-on experience of data modelling in Azure SQL Data Warehouse and Azure Analysis Services - Should be able to write and test DAX queries - Should be able to generate paginated reports in PowerBI - Should have a minimum of 3 years' working experience in delivering projects in PowerBI ROLE 2: - DataBricks expert (Strong SC or Specialist Senior) - Should have a minimum of 3 years' working experience of writing code in Spark and Scala ROLE 3: - One Azure backend expert (Strong SC or Specialist Senior) - Should have hands-on experience of working with ADLS, ADF and Azure SQL DW - Should have a minimum of 3 years' working experience of delivering Azure projects
Posted 1 week ago
4.0 - 9.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Your Role As a senior software engineer with Capgemini, you should have 4+ years of experience as an Azure Data Engineer with a strong project track record. In this role you will bring: Strong customer orientation, decision making, problem solving, communication and presentation skills. Very good judgement skills and the ability to shape compelling solutions and solve unstructured problems with assumptions. Very good collaboration skills and the ability to interact with multi-cultural and multi-functional teams spread across geographies. Strong executive presence and spirit. Superb leadership and team-building skills, with the ability to build consensus and achieve goals through collaboration rather than direct line authority. Your Profile Experience with Azure Databricks, Data Factory. Experience with Azure data components such as Azure SQL Database, Azure SQL Warehouse, Synapse Analytics. Experience in Python/PySpark/Scala/Hive programming. Experience with Azure Databricks/ADB. Experience building CI/CD pipelines in data environments. Primary Skills ADF (Azure Data Factory) OR ADB (Azure Databricks) Secondary Skills Excellent verbal and written communication and interpersonal skills Skills (competencies) Ab Initio Agile (Software Development Framework) Apache Hadoop AWS Airflow AWS Athena AWS Code Pipeline AWS EFS AWS EMR AWS Redshift AWS S3 Azure ADLS Gen2 Azure Data Factory Azure Data Lake Storage Azure Databricks Azure Event Hub Azure Stream Analytics Azure Synapse Bitbucket Change Management Client Centricity Collaboration Continuous Integration and Continuous Delivery (CI/CD) Data Architecture Patterns Data Format Analysis Data Governance Data Modeling Data Validation Data Vault Modeling Database Schema Design Decision-Making DevOps Dimensional Modeling GCP Big Table GCP BigQuery GCP Cloud Storage GCP DataFlow GCP DataProc Git Google Big Table Google Data Proc Greenplum HQL IBM Data Stage IBM DB2 Industry Standard Data Modeling (FSLDM) Industry Standard Data Modeling (IBM FSDM) Influencing Informatica IICS Inmon methodology JavaScript Jenkins Kimball Linux - Redhat Negotiation Netezza NewSQL Oracle Exadata Performance Tuning Perl Platform Update Management Project Management PySpark Python R RDD Optimization CentOS SAS Scala Spark Shell Script Snowflake SPARK SPARK Code Optimization SQL Stakeholder Management Sun Solaris Synapse Talend Teradata Time Management Ubuntu Vendor Management
Posted 1 week ago
5.0 - 10.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Roles and Responsibilities 1. Ensure smooth running of Oracle modules. 2. User support for business transactions. 3. New development and maintenance of Oracle Apps forms, reports, alerts, workflows, common functions, etc.; Oracle Apps technical skills with Dell Boomi, OIC, Python and APEX are an added advantage (APEX not required for the Kolkata position). 4. Acquire functional knowledge of areas of work. 5. Design, develop and implement custom functionality. 6. Prepare and update the documentation of existing and new functionalities. 7. Maintenance of existing custom components. 8. Train, guide and help new team members and project trainees in technical areas. 9. Work on tuning so that application performance gets better. 10. Follow up with the Oracle SR team on technical errors in Oracle. 11. Currently an individual contributor role. Preferred candidate profile: Key skills: Oracle ERP, Oracle ADF, OAF, Java, PL/SQL, SQL, Python, Edf. Integration tools like Dell Boomi, OIC, PAAS DB, OTBI would be an added advantage. BI tool Qlik Sense would be an added advantage.
Posted 1 week ago
8.0 - 13.0 years
11 - 15 Lacs
Mumbai
Work from Office
8+ years of experience in the Oracle EPM suite and delivering solutions on Planning and Budgeting applications (ePBCS) and PCMCS applications. Experience in financial and workforce planning. Worked on end-to-end implementations, from requirement gathering to user training. Experience in support and performance improvement. Knowledge of PBCS rules. Certification from Oracle would be an added advantage.
Posted 1 week ago
20.0 - 25.0 years
25 - 30 Lacs
Bengaluru
Work from Office
JOB TITLE/POSITION: Oracle Fusion Implementation and Maintenance Program Director
REGION/FUNCTION: ERP
ORGANISATION ENTITY: Technology
LOCATION: Bengaluru
JOB SUMMARY: We are seeking an experienced Oracle Fusion Implementation & Maintenance Program Director to lead the strategic planning, execution, and governance of a large-scale Oracle Fusion Cloud transformation initiative. This role has executive oversight of program delivery across multiple workstreams, including ERP, HCM, EPM, and other Fusion Cloud modules, PaaS, custom bolt-ons, and third-party application integrations, ensuring alignment with business goals, stakeholder expectations, and organizational readiness.
ROLES AND RESPONSIBILITIES:
Strategic Leadership: Own and drive the Oracle Fusion Cloud transformation strategy from initiation through go-live and post-implementation stabilization.
Program Governance: Establish and lead governance frameworks, executive steering committees, and decision-making forums.
Program Management: Develop and maintain detailed project plans, timelines, milestones, and deliverables using appropriate project management tools. Drive requirements gathering, solution design, testing, training, deployment, and post-go-live support.
Cross-Functional Alignment: Ensure alignment between business objectives, IT capabilities, and transformation goals across Finance, HR, and other functional areas.
Vendor & Partner Management: Oversee relationships with Oracle, system integrators, consultants, and other third-party providers.
Stakeholder Engagement: Serve as the point of contact for department heads and global/regional stakeholders.
Risk & Compliance Oversight: Ensure program execution adheres to regulatory, security, and compliance standards.
Financial Management: Own the program budget, track ROI, and ensure cost-effective execution of the implementation and maintenance phases.
Organizational Change Management: Champion change management, user adoption, and communication strategies across the enterprise.
Quality Assurance: Ensure robust QA, testing, data migration, and cutover planning to minimize disruption during go-live.
Executive Reporting: Provide regular updates to executive leadership, covering progress, risks, mitigation plans, and success metrics.
QUALIFICATIONS: Bachelor's/Master's degree in Information Systems, Business Administration, or a related field (MBA or equivalent preferred). Program/project management and Oracle Fusion certifications will be an added advantage.
WORK EXPERIENCE:
20+ years of IT program or project leadership experience, with at least 7 years focused on Oracle Fusion Cloud ERP, HCM, or EPM implementations.
Good experience working with Fusion technologies such as APEX and VBCS is desirable.
Demonstrated success in leading enterprise-scale cloud transformation or digital modernization initiatives.
Experience in regulated or highly matrixed industries (e.g., healthcare, financial services).
Experience with both global and multi-entity Oracle Fusion rollouts.
Strong understanding of business processes in Finance, HR, Procurement, and Projects.
Proficiency with project portfolio management tools and methodologies (Agile, Hybrid, or Waterfall).
Certifications such as PMP, PgMP, or Oracle Cloud certifications are a plus.
Excellent leadership, stakeholder management, negotiation, and communication skills.
Posted 1 week ago
5.0 - 8.0 years
3 - 6 Lacs
Chennai
Work from Office
Job Information: Job Opening ID ZR_2159_JOB | Date Opened 14/03/2024 | Industry: Technology | Work Experience: 5-8 years | Job Title: DevOps Engineer | City: Chennai | Province: Tamil Nadu | Country: India | Postal Code: 600004 | Number of Positions: 5
Mandatory skills: Azure DevOps, CI/CD pipelines, Kubernetes, Docker, cloud tech stack and ADF, Spark, Databricks, Jenkins, building Java-based applications, Java Web, Git, J2EE.
- Design and develop automated deployment arrangements by leveraging configuration management technology.
- Implement various development, testing, and automation tools, and IT infrastructure.
- Select and deploy appropriate CI/CD tools.
I'm interested
Posted 1 week ago
3.0 - 5.0 years
2 - 5 Lacs
Mumbai
Work from Office
Job Information: Job Opening ID ZR_1763_JOB | Date Opened 23/03/2023 | Industry: Technology | Work Experience: 3-5 years | Job Title: OIC Tech Developer | City: Mumbai | Province: Maharashtra | Country: India | Postal Code: 400079 | Number of Positions: 1
The main skill is VBCS (Visual Builder Cloud Service), with exposure to OIC (Oracle Integration Cloud). Should have proficiency in OIC and VBCS. Should be good at writing SQL queries. Should have experience in maintenance and production support. Should be able to understand end-to-end system architecture. Good communication skills. Should be able to meet SLAs as per the agreement. Should be flexible to learn new technologies. Knowledge of the Agile way of working and JIRA usage is good to have.
Location: Mumbai, Bangalore, Hyderabad, Pune, Kolkata
I'm interested
Posted 1 week ago
5.0 - 8.0 years
5 - 9 Lacs
Mumbai
Work from Office
Job Information: Job Opening ID ZR_1624_JOB | Date Opened 08/12/2022 | Industry: Technology | Work Experience: 5-8 years | Job Title: Azure ADF & Power BI Developer | City: Mumbai | Province: Maharashtra | Country: India | Postal Code: 400001 | Number of Positions: 4
Roles & Responsibilities:
- 5+ years of hands-on experience in Azure cloud development (ADF + Databricks) is mandatory.
- Strong in Azure SQL; knowledge of Synapse / Analytics is good to have.
- Experience working on Agile projects and familiarity with Scrum/SAFe ceremonies.
- Good written and verbal communication skills; can work directly with the customer. Flexible and ready to work in the 2nd shift.
- Defines, designs, develops, and tests software components/applications using Microsoft Azure: Databricks, ADF, ADL, Hive, Python, Spark SQL, PySpark.
- Expertise in Azure Databricks, ADF, ADL, Hive, Python, Spark, and PySpark.
- Strong T-SQL skills with experience in Azure SQL DW.
- Experience handling structured and unstructured datasets.
- Experience in data modeling and advanced SQL techniques.
- Experience implementing Azure Data Factory pipelines using the latest technologies and techniques.
- Good exposure to application development.
- The candidate should work independently with minimal supervision.
I'm interested
Posted 1 week ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: OneStream Extensive Finance SmartCPM
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: BE
Summary: As an Application Designer, you will be responsible for assisting in defining requirements and designing applications to meet business process and application requirements using OneStream Extensive Finance SmartCPM. Your typical day will involve collaborating with cross-functional teams and ensuring the delivery of high-quality solutions.
Roles & Responsibilities:
- Collaborate with cross-functional teams to define requirements and design applications using OneStream Extensive Finance SmartCPM.
- Ensure the delivery of high-quality solutions that meet business process and application requirements.
- Provide technical guidance and support to team members.
- Stay updated with the latest advancements in OneStream Extensive Finance SmartCPM and related technologies.
Professional & Technical Skills:
- Must have: extensive experience in OneStream Finance SmartCPM.
- Good to have: experience in related technologies such as Hyperion, SAP BPC, or Oracle EPM.
- Strong understanding of financial planning and analysis processes.
- Experience in designing and implementing financial consolidation and reporting solutions.
- Experience in designing and implementing budgeting and forecasting solutions.
Additional Information: The candidate should have a minimum of 5 years of experience in OneStream Extensive Finance SmartCPM, a strong educational background in finance, accounting, or a related field, and a proven track record of delivering impactful solutions. This position is based at our Bengaluru office.
Qualifications: BE
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Bengaluru
Work from Office
We are looking for a Microsoft BI & Data Warehouse Lead to design, develop, and maintain robust data warehouse and ETL solutions using the Microsoft technology stack. The ideal candidate will have extensive expertise in SQL Server development and Azure Data Factory (ADF). Benefits: health insurance, provident fund.
Posted 1 week ago