7.0 - 11.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Job Description
We are looking for a skilled and experienced Data Engineer to join our team and contribute to enhancement work in the Advanced Master Data Management (AMDM) space. The ideal candidate will have strong hands-on experience with Databricks, SQL, and Python, and must possess practical expertise in Azure Cloud services. You will play a key role in designing, developing, and optimizing scalable data pipelines and solutions to support enterprise-wide data initiatives.
Responsibilities:
- Develop and maintain scalable ETL/data pipelines using Databricks, SQL, and Python on Azure Cloud.
- Design and implement data integration workflows across structured and unstructured sources within the AMDM domain.
- Collaborate with data architects and business teams to translate requirements into efficient data solutions.
- Ensure data quality, consistency, and integrity across systems.
- Monitor, troubleshoot, and optimize data workflows in Azure-based environments.
- Leverage Azure services like Azure Data Factory, Azure Data Lake, Azure Synapse, and Azure Key Vault as part of solution delivery.
Qualifications:
- 6+ years of hands-on experience in Data Engineering or ETL development.
- Strong proficiency in Databricks for distributed data processing and transformation.
- Advanced skills in SQL and Python for building and automating data workflows.
- Solid working experience with core Azure Data Services.
- Experience with relational and non-relational database technologies (e.g., SQL Server, PostgreSQL, Oracle).
- Familiarity with Stibo or similar Master Data Management (MDM) tools.
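The extract-transform-load pattern this role centers on can be sketched in plain Python. This is a minimal, stdlib-only illustration of the pipeline shape (the posting's actual stack is Databricks/Spark on Azure; the function names, sample data, and sqlite3 sink here are illustrative assumptions, not the employer's code):

```python
import sqlite3

def extract(rows):
    # Extract: in production this would read from a source system
    # (e.g. a lake or warehouse table); here it is an in-memory list.
    return rows

def transform(rows):
    # Transform: normalise names and drop records with no id,
    # a stand-in for the data-quality checks the role calls for.
    return [(r["id"], r["name"].strip().title())
            for r in rows if r.get("id") is not None]

def load(conn, rows):
    # Load: idempotent upsert into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS master (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT OR REPLACE INTO master VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
source = [{"id": 1, "name": "  acme corp "}, {"id": None, "name": "bad row"}]
load(conn, transform(extract(source)))
print(conn.execute("SELECT * FROM master").fetchall())  # [(1, 'Acme Corp')]
```

On Databricks the same three stages would typically be Spark reads, DataFrame transformations, and Delta table writes; the separation of stages is what carries over.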
Posted 2 months ago
8.0 - 11.0 years
8 - 12 Lacs
Gurugram
Work from Office
Job Title: SAP CPI Consultant (Data Services)
Location: Chennai, Bangalore, Hyderabad, Pune, Gurugram, Kolkata, Noida
Experience: 8-11 Years
Job Description: We are seeking an experienced and proficient SAP CPI Consultant (Data Services) to join our team at our Bengaluru office. The ideal candidate will act as a Subject Matter Expert (SME), leading and collaborating with cross-functional teams to ensure the effective implementation and support of SAP GTS solutions.
Roles & Responsibilities:
- Serve as a Subject Matter Expert (SME) in SAP GTS, providing guidance and oversight on solution design and implementation.
- Collaborate with and manage team members to drive high performance and accountability.
- Take ownership of critical team decisions, ensuring alignment with business goals.
- Engage with multiple cross-functional teams and contribute to key strategic and technical decisions.
- Provide innovative and scalable solutions to both team-specific and cross-team technical challenges.
- Conduct and facilitate knowledge-sharing sessions to upskill team members and promote best practices.
- Monitor project progress, ensuring the timely delivery of high-quality application features.
- Maintain continuous improvement in system performance and user satisfaction.
Professional & Technical Skills (Must-Have):
- Proficiency in SAP Global Trade Services (GTS).
- Strong understanding of application development methodologies.
- Experience in the integration of SAP GTS with other enterprise applications.
- Familiarity with business process mapping and requirements gathering.
- Ability to troubleshoot and resolve application issues quickly and efficiently.
Posted 2 months ago
0.0 years
1 - 2 Lacs
Chennai
Work from Office
Greetings from Annexmed! Huge openings for Data Analyst - Non-Voice Process (Freshers) - Chennai
Desired Skills:
* Typing skill (upper / lower case)
* Qualification: Diploma or any degree
* Passed-out year: 2022 to 2025
* Good communication skills
* Location: Candidates must reside within a 15 km radius of the office location
Interview Time: 11:00 AM to 5:00 PM
Interview Days: Monday to Friday (Thursday, May 01, 2025 is a holiday)
Contact: Kamali (HR) - 8939711311
Shift: Mid Shift
Posted 2 months ago
10.0 - 11.0 years
14 - 19 Lacs
Hyderabad
Work from Office
Lead the GCP pillar within the Data Engineering CoE, establishing technical standards, best practices, and reusable accelerators for Google Cloud Platform data implementations. This role is critical for supporting high-value client engagements, including Verizon and other GCP-focused opportunities in our pipeline.
Key Responsibilities:
- Develop architecture patterns and implementation accelerators for GCP data platforms
- Establish best practices for BigQuery, Dataflow, Dataproc, and other GCP data services
- Support pre-sales activities for GCP-based opportunities, with particular focus on Verizon
- Design migration pathways from legacy systems to GCP
- Create technical documentation and playbooks for GCP implementations
- Mentor junior team members on GCP best practices
- Work with cloud-agnostic platforms (Databricks, Snowflake) in GCP environments
- Build deep expertise in enterprise-scale GCP deployments
- Collaborate with other pillar architects on cross-platform solutions
- Represent the company's GCP capabilities in client engagements
Qualifications:
- 10+ years of data engineering experience with a minimum of 5+ years focused on GCP
- Deep expertise in BigQuery, Dataflow, Dataproc, and Cloud Storage
- Experience implementing enterprise-scale data lakes on GCP
- Strong know
Posted 2 months ago
12.0 - 15.0 years
14 - 18 Lacs
Hyderabad
Work from Office
Lead the technical vision and strategy for the Data Engineering Center of Excellence across cloud platforms (GCP, Azure, AWS), cloud-agnostic platforms (Databricks, Snowflake), and legacy systems. This leadership role will establish architectural standards and best practices while providing pre-sales leadership for strategic opportunities.
Key Responsibilities:
- Define and drive the technical vision and roadmap for the Data Engineering CoE
- Establish cross-cloud architecture standards and best practices with emphasis on Azure, GCP, and AWS
- Lead pre-sales activities for strategic opportunities, particularly AWS-, Azure-, and GCP-focused clients
- Build the CoE's accelerator development framework
- Mentor and guide pillar architects across all platforms
- Drive platform selection decisions and integration strategies
- Establish partnerships with key technology providers, especially cloud vendors
- Define governance models for the CoE implementation
- Represent the organization as a technical thought leader in client engagements
Qualifications:
- 12+ years of data engineering experience with 6+ years in leadership roles
- Deep expertise in Google Cloud Platform data services (BigQuery, Dataflow, Dataproc)
- Strong knowledge of other cloud platforms (Azure Fabric/Synapse, Data Factory,
Posted 2 months ago
6.0 - 16.0 years
17 - 19 Lacs
Hyderabad
Work from Office
We are looking to hire SAP BTP professionals in the following areas:
1. SAP BTP Exposure
- Understanding SAP BTP: A solid grasp of SAP Business Technology Platform (BTP), including its core services like SAP HANA Cloud, SAP Integration Suite, SAP AI/ML, and SAP Mobile Services.
- SAP Cloud Foundry: Understanding Cloud Foundry as the application runtime environment for developing cloud-native applications in SAP BTP.
2. Cloud Application Programming Model (CAPM)
- Core CAP Concepts: Understanding the key principles of the Cloud Application Programming Model, such as service definitions, entities, data models, service bindings, and business logic. Familiarity with CAP CDS (Core Data Services) for defining data models, and CAP Node.js or CAP Java for implementing business logic.
- CAP CLI (Command-Line Interface): Ability to use CAP tools to scaffold, test, and deploy applications.
3. Programming Languages and Frameworks
- JavaScript/Node.js: Since CAP supports Node.js, knowledge of JavaScript and its Node.js environment is essential for developing backend services.
- Java: Some implementations of CAP applications are built with Java, so familiarity with Spring Boot and Java frameworks may be helpful.
- OData and REST APIs: CAP applications often expose data via OData or RESTful APIs, so understanding how to consume and expose data through these protocols is necessary.
4. SAP HANA Database and CDS (Core Data Services)
- SAP HANA Knowledge: As SAP BTP and CAP are tightly integrated with SAP HANA, understanding how to interact with the HANA database and its advanced features (like SQLScript, table functions, etc.) is crucial.
- CDS (Core Data Services): Experience with CDS to model data and create entities, views, and associations. This includes defining annotations for business logic, authorization, and UI capabilities.
5. SAP Fiori and UI5 (Frontend Development)
- SAP Fiori: Knowledge of Fiori design principles and how to build modern, user-friendly UIs for SAP applications.
- SAP UI5: Familiarity with SAP UI5 (a framework for building responsive UIs), which integrates closely with CAP applications to provide front-end solutions.
Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
Posted 2 months ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Job Requirements:
EXPERIENCE: 5-8 years of preferred experience in a data engineering role, with a minimum of 4 years of preferred experience in Azure data services (Data Factory, Databricks, ADLS, SQL DB, etc.)
EDUCATION: Minimum Bachelor's Degree in Computer Science, Computer Engineering, or "STEM" majors (Science, Technology, Engineering, and Math)
SKILLS/REQUIREMENTS:
- Strong working knowledge of Databricks and ADF
- Expertise working with databases and SQL
- Strong working knowledge of code management and continuous integration systems (Azure DevOps or GitHub) preferred
- Familiarity with Agile delivery methodologies
- Familiarity with NoSQL databases (such as MongoDB) preferred
- Any experience with IoT data standards like Project Haystack, Brick Schema, or Real Estate Core is an added advantage
- Ability to multi-task and reprioritize in a dynamic environment
- Outstanding written and verbal communication skills
Posted 2 months ago
2.0 - 5.0 years
8 - 12 Lacs
Thane
Work from Office
- Should be well versed with .NET and SQL technologies
- Should have prior experience of working on applications with APIs and interfaces
- Should be aware of AWS data services, specifically for supporting cloud-based applications
- Should have an understanding of IT best practices for general insurance and applications
- Should have an understanding of ITIL processes like Incident, Problem, and Change management
- Should have a good understanding of the insurance business
- Ability to lead internal as well as external vendor teams and focus the team on resolution and delivery
- Ability to understand and grasp the business need and work cohesively with the implementation team
Posted 2 months ago
5.0 - 7.0 years
9 - 13 Lacs
Bengaluru
Work from Office
At Johnson & Johnson, we believe health is everything. Our strength in healthcare innovation empowers us to build a world where complex diseases are prevented, treated, and cured, where treatments are smarter and less invasive, and solutions are personal. Through our expertise in Innovative Medicine and MedTech, we are uniquely positioned to innovate across the full spectrum of healthcare solutions today to deliver the breakthroughs of tomorrow, and profoundly impact health for humanity. Learn more at
Job Function: Data Analytics & Computational Sciences
Job Sub Function: Data Engineering
Job Category: Scientific/Technology
All Job Posting Locations: Bangalore, Karnataka, India
Job Description: Position Summary
Johnson & Johnson MedTech is seeking a Sr Eng Data Engineering for the Digital Surgery Platform (DSP) in Bangalore, India. Johnson & Johnson (J&J) stands as the world's leading manufacturer of healthcare products and a service provider in the pharmaceutical and medical device sectors. At Johnson & Johnson MedTech's Digital Surgery Platform, we are pioneering the future of healthcare by harnessing the power of people and technology, transitioning to a digital-first MedTech enterprise. With a focus on innovation and an ambitious strategic vision, we are integrating robotic-assisted surgery platforms, connected medical devices, surgical instruments, medical imaging, surgical efficiency solutions, and OR workflow into the next-generation MedTech platform. This initiative will also foster new surgical insights, improve supply chain innovation, use cloud infrastructure, incorporate cybersecurity, collaborate with hospital EMRs, and elevate our digital solutions. We are a diverse and growing team that nurtures creativity, a deep understanding of data processing techniques, and the use of sophisticated analytics technologies to deliver results.
Overview: As a Sr Eng Data Engineering for the J&J MedTech Digital Surgery Platform (DSP), you will play a pivotal role in building the modern cloud data platform by demonstrating your in-depth technical expertise and interpersonal skills. In this role, you will focus on accelerating digital product development as part of the multifunctional and fast-paced DSP data platform team, and will contribute to the digital transformation through innovative data solutions. One of the key success criteria for this role is to ensure the quality of DSP software solutions, demonstrate the ability to collaborate effectively with the core infrastructure and other engineering teams, and work closely with the DSP security and technical quality partners.
Responsibilities:
- Work with platform data engineering, core platform, security, and technical quality teams to design, implement, and deploy data engineering solutions.
- Develop pipelines for ingestion, transformation, orchestration, and consumption of various types of data.
- Design and deploy data layering pipelines that use modern Spark-based data processing technologies such as Databricks and Delta Live Tables (DLT).
- Integrate data engineering solutions with Azure data governance components, including but not limited to Purview and Databricks Unity Catalog.
- Implement and support security monitoring solutions within the Azure Databricks ecosystem.
- Design, implement, and support data monitoring solutions in data analytical workspaces.
- Configure and deploy Databricks analytical workspaces in Azure with IaC (Terraform, Databricks API) using J&J DevOps automation tools within the JPM/Xena framework.
- Implement automated CI/CD processes for data processing pipelines.
- Support DataOps for the distributed DSP data architecture.
- Function as a data engineering SME within the data platform.
- Manage authoring and execution of automated test scripts.
- Build effective partnerships with DSP architecture, core infrastructure, and other domains to design and deploy data engineering solutions.
- Work closely with the DSP Product Managers to understand business needs, translate them to system requirements, and demonstrate an in-depth understanding of use cases for building prototypes and solutions for data processing pipelines.
- Operate in SAFe Agile DevOps principles and methodology in building quality DSP technical solutions.
- Author and implement automated test scripts as mandated by DSP quality requirements.
Qualifications
Required:
- Bachelor's degree or equivalent experience in software, computer science, or data engineering.
- 8+ years of overall IT experience.
- 5-7 years of experience in cloud computing and data systems.
- Advanced Python programming skills.
- Expert level in Azure Databricks Spark technology and data engineering (Python), including Delta Live Tables (DLT).
- Experience in design and implementation of secure Azure data solutions.
- In-depth knowledge of the data architecture: infrastructure, network components, data processing.
- Proficiency in building data pipelines in Azure Databricks.
- Proficiency in configuration and administration of Azure Databricks workspaces and Databricks Unity Catalog.
- Deep understanding of the principles of the modern data Lakehouse.
- Deep understanding of Azure system capabilities and data services, and the ability to implement security controls.
- Proficiency with enterprise DevOps tools including Bitbucket, Jenkins, and Artifactory.
- Experience with DataOps.
- Experience with quality software systems.
- Deep understanding of and experience in SAFe Agile.
- Understanding of the SDLC.
Preferred:
- Master's degree or equivalent.
- Proven healthcare experience.
- Azure Databricks certification.
- Ability to analyze use cases and translate them into system requirements, and make data-driven decisions.
- Experience with DevOps automation tools within the JPM/Xena framework.
- Expertise in automated testing.
- Experience in AI and ML.
Excellent verbal and written communication skills. Ability to travel domestically up to 10% is required. Johnson & Johnson is an Affirmative Action and Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, age, national origin, or protected veteran status, and will not be discriminated against on the basis of disability.
Posted 2 months ago
8.0 - 12.0 years
13 - 17 Lacs
Noida
Work from Office
Job Title: Data Architect
Location: Jewar Airport, Noida
Experience: 8+ Years
We are looking for a Data Architect to oversee our organization's data architecture, governance, and product lifecycle. The role focuses on managing data layers, maintaining data governance frameworks, and creating data products aligned with business objectives.
Key Responsibilities:
- Design and maintain the Lakehouse architecture, including data lake setup and management.
- Create and maintain data products, ensuring their alignment with business needs.
- Develop and enforce data governance policies, including the maintenance of a data catalog.
- Design data models and define database development standards.
- Automate workflows using Python, CI/CD pipelines, and unit tests.
Required Skills and Experience:
- Extensive experience in data architecture and data platform management.
- Expertise in data governance, data modeling, and database development.
- Proficiency in Python for automation and pipeline development.
- Familiarity with Azure data services and data processing pipelines.
Posted 2 months ago
0.0 - 5.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer - DBT (Data Build Tool)
Experience: 0-5 Years
Location: Bengaluru
Job Responsibilities:
- Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on AWS
- Requirements definition, source data analysis and profiling, the logical and physical design of the data lake and data warehouse, as well as the design of data integration and publication pipelines
- Develop Snowflake deployment and usage best practices
- Help educate the rest of the team members on the capabilities and limitations of Snowflake
- Build and maintain data pipelines adhering to suggested enterprise architecture principles and guidelines
- Design, build, test, and maintain data management systems
- Work in sync with internal and external team members like data architects, data scientists, and data analysts to handle all sorts of technical issues
- Act as technical leader within the team
- Work in an Agile/Lean model
- Deliver quality deliverables on time
- Translate complex functional requirements into technical solutions
EXPERTISE AND QUALIFICATIONS
Essential Skills, Education and Experience:
- Should have a B.E. / B.Tech. / MCA or equivalent degree along with 4-7 years of experience in Data Engineering
- Strong experience in DBT concepts like model building and configurations, incremental load strategies, macros, and DBT tests
- Strong experience in SQL
- Strong experience in AWS
- Creation and maintenance of an optimum data pipeline architecture for ingestion and processing of data
- Creation of necessary infrastructure for ETL jobs from a wide range of data sources using Talend, DBT, S3, and Snowflake
- Experience in data storage technologies like Amazon S3, SQL, and NoSQL
- Data modeling technical awareness
- Experience in working with stakeholders in different time zones
Good to have:
- AWS data services development experience
- Working knowledge of Big Data technologies
- Experience collaborating with data quality and data governance teams
- Exposure to reporting tools like Tableau
- Apache Airflow, Apache Kafka (nice to have)
- In-depth understanding of the payments domain (CRM, Accounting, etc.)
- Regulatory reporting exposure
Other skills:
- Good communication skills
- Team player
- Problem solver
- Willing to learn new technologies, share your ideas, and assist other team members as needed
- Strong analytical and problem-solving skills; ability to define problems, collect data, establish facts, and draw conclusions
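The incremental load strategy the posting names as a core DBT concept can be illustrated outside DBT with a stdlib-only Python/SQLite sketch. In DBT itself this is the `is_incremental()` filter inside a model's SQL; the table and column names below are made up for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source (id INTEGER, loaded_at INTEGER)")
conn.execute("CREATE TABLE target (id INTEGER, loaded_at INTEGER)")
conn.executemany("INSERT INTO source VALUES (?, ?)", [(1, 100), (2, 200), (3, 300)])

# First (full) run: the target is empty, so take everything.
conn.execute("INSERT INTO target SELECT * FROM source")

# New data arrives in the source between runs.
conn.execute("INSERT INTO source VALUES (4, 400)")

# Incremental run: only rows newer than the high-water mark already in
# the target, mirroring DBT's is_incremental() WHERE clause.
conn.execute("""
    INSERT INTO target
    SELECT * FROM source
    WHERE loaded_at > (SELECT MAX(loaded_at) FROM target)
""")
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 4
```

The point of the pattern is that each run touches only the new slice of the source, which is what keeps large warehouse models cheap to refresh.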
Posted 2 months ago
10.0 - 15.0 years
12 - 17 Lacs
Indore, Hyderabad, Ahmedabad
Work from Office
Job Title: Technical Architect / Solution Architect / Data Architect (Data Analytics)
Notice Period: Immediate to 15 Days
Experience: 9+ Years
Job Summary: We are looking for a highly technical and experienced Data Architect / Solution Architect / Technical Architect with expertise in Data Analytics. The candidate should have strong hands-on experience in solutioning, architecture, and cloud technologies to drive data-driven decisions.
Key Responsibilities:
- Design, develop, and implement end-to-end data architecture solutions.
- Provide technical leadership in Azure, Databricks, Snowflake, and Microsoft Fabric.
- Architect scalable, secure, and high-performing data solutions.
- Work on data strategy, governance, and optimization.
- Implement and optimize Power BI dashboards and SQL-based analytics.
- Collaborate with cross-functional teams to deliver robust data solutions.
Primary Skills Required:
- Data Architecture & Solutioning
- Azure Cloud (Data Services, Storage, Synapse, etc.)
- Databricks & Snowflake (Data Engineering & Warehousing)
- Power BI (Visualization & Reporting)
- Microsoft Fabric (Data & AI Integration)
- SQL (Advanced Querying & Optimization)
Looking for immediate to 15-day joiners!
Posted 2 months ago
2.0 - 5.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Summary: Kroll Agency and Trustee Services provides conflict-free, flexible, and highly efficient administrative and trustee services to the global loan and bond markets. As a leading independent service provider, we specialize in the administration of privately placed notes, restructuring situations, and syndicated, bilateral, and private credit transactions. Our team of industry-leading experts, coupled with our high-touch service, speed of execution, and 24/7 responsiveness, sets us apart from other providers. To learn more, please visit https://www.kroll.com/en/services/agency-and-trustee-services
We are currently hiring a position within the Data Services process. This role requires working closely with Transaction Managers (Front Office), external clients, KYC, and the Operations team to ensure the portfolio administration tasks are completed. The individual will have responsibility for the completion of many key portfolio and transaction tasks, and for reporting of aged items and ownership of resolution. As such, they are required to be detail-oriented, organized, and able to maintain accurate and complete records at all times. The ideal candidate will be a proactive and meticulous critical thinker, and must possess attributes of sound judgement, tact, and diplomacy; strong analytical skills with an ability to identify issues; and the ability to act independently (decision making) while also being a team player.
Responsibilities:
Static Management:
- Data input and maintenance on a proprietary loan administration platform. This task involves accurately entering and updating data related to loans in a specialized software system designed specifically for managing loan information.
- Monitor the Data Services inbox for receipt of documents and queries. The aim is to promptly identify and address any incoming requests or issues that may require attention, ensuring that all communications are responded to in a timely manner.
- Collaborate with various teams within the organization to understand data flows and processes, working closely with different departments to gain a comprehensive understanding of how data moves through various systems and processes.
- Ongoing maintenance of lender/borrower contact static setup in LIQ and other systems. This includes ensuring that contact details are accurate and that any changes in contact information are reflected across all platforms to facilitate efficient communication.
- Manage any ad-hoc tasks and ensure completion within the expected TAT. Handling various unplanned or one-off tasks that arise unexpectedly requires effective time management and organizational skills, so that all such tasks are completed within the expected turnaround time (TAT) while maintaining overall workflow efficiency.
- Functional knowledge of Loan IQ and the lending domain is a plus, as is knowledge of building different types of payment instructions in the Loan IQ application.
- Static setup of remittance instructions and customers, and performing callbacks to confirm the payment details of borrowers/lenders and the admin details with customers. This involves conducting follow-up calls to confirm payment details, ensuring accuracy and preventing any potential discrepancies in transactions.
Support Functions:
- Attend the daily WIP call and update the team on workstreams. Effective collaboration ensures that all teams are aligned and that any potential bottlenecks or inefficiencies in data handling are identified and addressed.
- Work closely with the management accounting and operations teams to assist in resolving any queries arising from the payments and operations teams.
- Work with management to assist in the production of the Monthly MIS Report.
- Managing ad-hoc transaction activity: assisting the transaction management team to administer ad-hoc and unscheduled transaction activity as directed and in accordance with procedures.
- Reporting and Compliance: completing and delivering regular reports and action points to management, and working to improve procedures and processes based on regular findings.
Process Management:
- Demonstrate high regard for the organization's policies & procedures.
- Demonstrate accountability & ownership.
- Identify, analyze, prioritize, treat & monitor risks.
- Effectively manage risk & business goals.
- Manage process controls effectively.
Requirements:
- Bachelor's degree in commerce / finance or relevant experience
- Experience in lending operations
- Experience with loan systems such as Loan IQ
- Strong oral and written communication skills
- Ability to work overtime as needed to support the team and ensure critical work is performed
- Flexibility to work in shifts
- Ability to manage sensitive and confidential information
Posted 2 months ago
2.0 - 4.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Skills: DevOps, Data Services->TDM (Test Data Management), Agile Coach->Consulting Process (Agile), DevOps->TOSCA, Automation Testing
Responsibilities - A day in the life of a finserv consultant: As part of the finserv consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, and define the to-be processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Posted 2 months ago
5.0 - 8.0 years
7 - 11 Lacs
Hyderabad
Work from Office
A catastrophe modeling job involves analyzing and assessing the potential impact of catastrophic events (e.g., natural disasters like earthquakes, floods, hurricanes) on assets, infrastructure, and populations. The role typically includes developing, refining, and applying mathematical models to predict and evaluate risks, helping companies (such as insurers or government agencies) prepare for and mitigate the financial impact of such events. Responsibilities may also include data analysis, scenario testing, and collaborating with cross-functional teams to inform risk management strategies. Proficiency in data science, programming, and a strong understanding of geophysical or environmental factors are often required.
Skill Set Required (Mandatory):
- 5 to 8 years of experience
- Hands-on experience with AIR (Touchstone / TS Re and CATRADER) software
- Experience in the CAT Modeling industry
- Should understand & interpret CAT Modeling losses
- Understanding of policy structure (layers, limits, deductibles) and how it works in the insurance industry
- Insurance & reinsurance subject matter, underwriting concepts
- Attention to detail and superior communication skills
- Experience in Open Market & Binder account processing & auditing
- Proficiency in Excel and SQL, and analytical skills
Desirable Skill Set (Add-on):
- Writing macros using VB scripts; underwriting concepts
The position in the Data Services team offers an interesting range of responsibilities, including cleansing, augmenting, validating, and preparing catastrophe model exposure data for different lines of business; applying insurance & reinsurance policy conditions; analyzing client exposure data against different perils; quantifying natural catastrophe risk using catastrophe modeling software; and reviewing the work (accounts) done by analysts while maintaining clients' turnaround time and quality at all times. Should understand & interpret the losses, with an understanding of the Touchstone product and database structure.
Maintain/manage the account log sheet, assign work to team members, audit/review the accounts done by risk analysts, manage the workflow in the absence of the Team Lead/Manager, and raise client queries; attention to detail and superior communication skills are required.
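The policy structure the posting refers to (layers, limits, deductibles) reduces to a simple calculation: a layer pays the portion of a gross loss above its attachment point, capped at the layer limit. A stdlib-only sketch with illustrative numbers (the monetary figures are assumptions for the example, not drawn from the posting):

```python
def layer_loss(gross_loss, attachment, limit):
    # Loss to a layer = the part of the gross loss above the
    # attachment point (deductible), capped at the layer limit.
    return max(0, min(gross_loss - attachment, limit))

# A "10M xs 5M" layer: attaches at 5M, pays at most 10M.
print(layer_loss(12_000_000, 5_000_000, 10_000_000))  # 7000000
print(layer_loss(4_000_000, 5_000_000, 10_000_000))   # 0 (loss below attachment)
print(layer_loss(20_000_000, 5_000_000, 10_000_000))  # 10000000 (capped at limit)
```

Stacking several such layers with increasing attachment points is what builds the tower of coverage that CAT modeling tools such as Touchstone apply to exposure data.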
Posted 2 months ago
5.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Work from Office
A catastrophe modeling job involves analyzing and assessing the potential impact of catastrophic events (e.g., natural disasters like earthquakes, floods, hurricanes) on assets, infrastructure, and populations. The role typically includes developing, refining, and applying mathematical models to predict and evaluate risks, helping companies (such as insurers or government agencies) prepare for and mitigate the financial impact of such events. Responsibilities may also include data analysis, scenario testing, and collaborating with cross-functional teams to inform risk management strategies. Proficiency in data science, programming, and a strong understanding of geophysical or environmental factors are often required.
Skill Set Required (Mandatory):
- 5 to 8 years of experience
- Hands-on experience with AIR (Touchstone / TS Re and CATRADER) software
- Experience in the CAT Modeling industry
- Should understand & interpret CAT Modeling losses
- Understanding of policy structure (layers, limits, deductibles) and how it works in the insurance industry
- Insurance & reinsurance subject matter, underwriting concepts
- Attention to detail and superior communication skills
- Experience in Open Market & Binder account processing & auditing
- Proficiency in Excel and SQL, and analytical skills
Desirable Skill Set (Add-on):
- Writing macros using VB scripts; underwriting concepts
The position in the Data Services team offers an interesting range of responsibilities, including cleansing, augmenting, validating, and preparing catastrophe model exposure data for different lines of business; applying insurance & reinsurance policy conditions; analyzing client exposure data against different perils; quantifying natural catastrophe risk using catastrophe modeling software; and reviewing the work (accounts) done by analysts while maintaining clients' turnaround time and quality at all times. Should understand & interpret the losses, with an understanding of the Touchstone product and database structure.
Maintain/manage account log sheet, Assign the work to team members, Audit/review the accounts done by risk analysts, manage the workflow in absence of Team Lead/Manager, raising client queries, attention to detail and superior communication skills.
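The policy-structure concepts this role requires (deductibles, attachment points, limits) can be illustrated with a minimal sketch. This is a hypothetical example, not part of the posting: the function name and the figures are invented for illustration.

```python
def loss_to_layer(gross_loss: float, attachment: float, limit: float) -> float:
    """Recovery for a reinsurance layer: the part of the gross loss that
    exceeds the attachment point (deductible), capped at the layer limit."""
    return min(max(gross_loss - attachment, 0.0), limit)

# A "5M xs 2M" layer: attaches at 2M of loss and pays at most 5M.
for gross in (1_000_000, 4_000_000, 10_000_000):
    print(gross, "->", loss_to_layer(gross, attachment=2_000_000, limit=5_000_000))
# 1M is below the attachment (recovers 0), 4M recovers 2M, 10M hits the 5M cap.
```

Real CAT modeling tools such as Touchstone apply many more conditions (per-risk and aggregate terms, coinsurance, reinstatements), but the layering arithmetic above is the core idea.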
Posted 2 months ago
3.0 - 8.0 years
5 - 10 Lacs
Gurugram
Work from Office
Data Engineer - Immediate Joiner
Do you love working with data and building scalable solutions that can handle large volumes of data? Are you passionate about helping companies make data-driven decisions and achieve their goals? If so, we are looking for a talented Data Engineer to join our team! We are Uptitude, a fast-growing start-up with a global client base, headquartered in London, UK, and we are looking for someone to join us full time in our cool office in Gurugram.
About Uptitude:
Uptitude is a forward-thinking consultancy that specializes in providing exceptional data and business intelligence solutions to clients worldwide. Our team is passionate about empowering businesses with data-driven insights, enabling them to make informed decisions and achieve remarkable results. At Uptitude, we embrace a vibrant and inclusive culture, where innovation, excellence, and collaboration thrive. We are seeking a highly skilled Data Engineer to join our team in the next month.
Responsibilities:
Design and implement modern data architectures using Azure Data Services (Azure Data Lake, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, etc.).
Develop batch data processing solutions, ETL processes, and automated workflows.
Work collaboratively with data analysts, Power BI developers, and stakeholders to understand data needs and deliver comprehensive data solutions.
Ensure data solutions are scalable, repeatable, effective, and aligned with business goals and strategies.
Keep abreast of industry best practices.
Troubleshoot and debug data issues, ensuring robust and error-free data pipelines.
Requirements:
Minimum of 3 years of experience as a Data Engineer or in a similar role, with a proven track record working with Azure services (Azure Data Engineer Associate certification is an asset).
Strong proficiency in SQL and experience with structured and unstructured data models.
In-depth knowledge of Azure data services and tools (Azure SQL Database, Azure Data Factory, Data Lake, Databricks, Azure Synapse Analytics).
Proficiency in a scripting language such as Python.
Understanding of big data technologies (Hadoop, Spark) and working experience with PySpark.
Experience using Azure DevOps to implement CI/CD pipelines is desirable.
Excellent problem-solving abilities and strong analytical skills.
Excellent communication and teamwork skills, with the ability to interact at all levels of the organization.
Working knowledge of Power BI is desirable.
Company Values:
At Uptitude, we embrace a set of core values that guide our work and define our culture. As a Data Engineer, you should align with these values:
1. Be Awesome: Strive for excellence in everything you do, continuously improving your skills and delivering exceptional results.
2. Step Up: Take ownership of challenges, be proactive, and seek opportunities to contribute beyond your role.
3. Make a Difference: Embrace innovation, think creatively, and contribute to the success of our clients and the company.
4. Have Fun: Foster a positive and enjoyable work environment, celebrating achievements and building strong relationships.
Benefits:
Uptitude values its employees and offers a competitive benefits package, including:
Competitive salary commensurate with experience and qualifications.
Private health insurance coverage.
Offsite trips to encourage team building and knowledge sharing.
Quarterly team outings to unwind and celebrate achievements.
Corporate English lessons with a UK instructor.
If you are passionate about Data Engineering and want to be part of a team that is making a real impact, we want to hear from you! At Uptitude, we are committed to building a team of talented and passionate individuals who are dedicated to our mission and share our values of innovation, collaboration, and customer success. If this sounds like you, please apply today!
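The batch ETL work described in this role can be sketched in plain Python. This is a minimal, self-contained illustration with invented field names; a production pipeline would run the same extract-transform-load steps in PySpark, Databricks, or Azure Data Factory against real storage.

```python
import csv
import io

# Extract: parse raw CSV (an in-memory string standing in for a data-lake file).
raw = "order_id,amount\n1,100\n2,\n3,250\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop incomplete records and cast types (a basic data-quality gate).
clean = [
    {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
    for r in rows
    if r["amount"]  # skip rows with a missing amount
]

# Load: here we just aggregate; a real pipeline would write to a warehouse table.
total = sum(r["amount"] for r in clean)
print(len(clean), total)
```

The data-quality gate in the middle is where "robust and error-free pipelines" is earned in practice: bad rows are filtered (or quarantined) explicitly rather than allowed to fail the load.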
Posted 2 months ago
7.0 - 14.0 years
9 - 16 Lacs
Chennai
Work from Office
5+ years of hands-on experience in ABAP development with a strong focus on RICEF objects (Reports, Interfaces, Conversions, Enhancements, and Forms).
In-depth knowledge of Object-Oriented ABAP (OO ABAP) programming for modern SAP development.
Proven experience with the Clean Core approach, implementing SAP standard solutions with minimal customization to ensure long-term sustainability and ease of upgrades.
SAP HANA expertise, including experience developing AMDP procedures and CDS Views for complex requirements.
Proficiency in developing applications using the ABAP RESTful Application Programming Model (RAP).
Strong experience with Core Data Services (CDS) Views and OData services for integration and data modeling.
Experience with SAP Fiori and SAP UI5, including developing OData services to integrate the backend with frontend UI applications.
EML (Entity Manipulation Language) for advanced transactional scenarios in both standard and custom APIs.
RAP-based APIs (managed and unmanaged scenarios) with actions, authorization checks, and field validation in RAP APIs and CDS.
Fiori-based custom reports and message implementation in RAP.
Deep understanding of SAP SD, MM, and FI modules and business processes.
Excellent analytical and problem-solving skills, with the ability to debug, troubleshoot, and resolve technical issues efficiently.
Hands-on experience with SAP NetWeaver Gateway for managing OData services.
Strong communication and collaboration skills to work effectively with both technical and functional teams.
Ability to handle complex end-to-end implementations and deliver high-quality results in a fast-paced environment.
Posted 2 months ago
5 - 10 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Google Cloud Data Services
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication within the team and with stakeholders.
Roles & Responsibilities:
Expected to be an SME.
Collaborate with and manage the team to perform.
Responsible for team decisions.
Engage with multiple teams and contribute to key decisions.
Provide solutions to problems for the immediate team and across multiple teams.
Lead the application development process.
Ensure effective communication within the team and with stakeholders.
Professional & Technical Skills:
Must-have skills: proficiency in Google Cloud Data Services.
Strong understanding of cloud computing principles.
Experience with cloud-based application development.
Knowledge of data storage and processing in the cloud environment.
Additional Information:
The candidate should have a minimum of 5 years of experience in Google Cloud Data Services.
This position is based at our Bengaluru office.
15 years of full-time education is required.
Qualifications: 15 years of full-time education
Posted 2 months ago
5 - 10 years
7 - 12 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members, analyzing requirements, and developing solutions to meet business needs.
Roles & Responsibilities:
Expected to be an SME.
Collaborate with and manage the team to perform.
Responsible for team decisions.
Engage with multiple teams and contribute to key decisions.
Provide solutions to problems for the immediate team and across multiple teams.
Lead and mentor junior team members.
Conduct regular code reviews to ensure quality standards are met.
Professional & Technical Skills:
Must-have skills: proficiency in SAP BusinessObjects Data Services.
Strong understanding of ETL processes.
Experience in data modeling and database design.
Knowledge of SAP BusinessObjects reporting tools.
Hands-on experience in troubleshooting and debugging applications.
Additional Information:
The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services.
This position is based at our Hyderabad office.
15 years of full-time education is required.
Qualifications: 15 years of full-time education
Posted 2 months ago
8 - 13 years
13 - 18 Lacs
Mumbai, Hyderabad, Bengaluru
Work from Office
Impact: This position offers the opportunity to lead a team of highly motivated individuals and contribute to achieving the team's goals. You will lead efforts to improve data accuracy, completeness, and timeliness through collaboration, innovation, and the execution of ad-hoc projects, with a focus on acquiring and collecting public and private data. Through strong stakeholder relationships and a deep understanding of market dynamics, we position ourselves as trusted partners, equipping clients with the intelligence needed to navigate opportunities and risks in a competitive environment.
Responsibilities:
Formulate and implement data-driven strategies that balance technical and product knowledge, collaborating with multiple teams to create best-in-class solutions.
Oversee and implement data quality projects that align with evolving business priorities, ensuring high standards of data integrity.
Identify opportunities for new datasets within the market landscape and support the development of strategies to incorporate them into existing frameworks.
Demonstrate empathy and support team members, especially during challenging times, promoting a culture of well-being and collaboration.
Encourage team motivation, facilitate career progression discussions, and execute succession planning to nurture talent within the team.
Enhance the technical skills of the team, preparing them for future growth and evolving industry demands.
Establish SMART objectives for team members, actively manage performance, and communicate the Pay for Performance culture and its linkage to rewards.
Track and communicate team performance metrics, including time utilization and quality statistics, while setting challenging benchmarks for resource efficiency.
Mentor the team on industry trends and large-scale data projects, providing guidance on business initiatives.
Manage short-term and long-term projects from resource planning to execution, collaborating closely with the Data Management team to ensure alignment and effectiveness.
Drive constructive conversations with the leadership team and stakeholders across various locations, ensuring alignment on goals and expectations.
Advocate for a culture of innovation by understanding processes and workflows, generating ideas to eliminate content gaps and establish best practices.
Foster a lean mindset to improve operational efficiency.
Ensure all critical timelines and requirements for business-as-usual workflows, KPIs, and projects are met, demonstrating problem-solving capabilities at all levels.
As a people leader, embody and promote the organization's values, culture, and strategic objectives, setting an example for the team.
What we are looking for:
Prior leadership experience in data services, with a strong focus on people management.
Knowledge or experience in the industry is preferred.
In-depth understanding of the mechanics of the capital markets domain, with the ability to quantify trends impacting the industry and provide insightful analysis.
Proven operational management skills with a keen attention to detail, gained within a respected data company, ensuring effective oversight of data quality and performance.
Experience in introducing and monitoring Key Performance Indicators (KPIs) and performance metrics, facilitating continuous improvement and accountability within the team.
Capacity to give and receive constructive feedback, providing coaching to team members to foster their professional growth and development.
Exceptional oral and written communication skills, enabling clear articulation of complex data insights and fostering effective stakeholder engagement.
Willingness to work across various shifts, including night shifts on a rotational or as-needed basis, demonstrating adaptability to meet business needs.
Maintains high ethical standards both personally and professionally, ensuring transparency and integrity within the team.
Strong collaboration skills, with the ability to work effectively within cross-functional teams and build relationships with various stakeholders.
Comfort with change management processes, adapting to evolving business needs and driving innovation within the team.
Familiarity with additional analytical tools or programming languages that enhance data analysis capabilities.
Experience in managing projects from inception to completion, including the ability to prioritize tasks and manage resources effectively.
Understanding of cultural differences and the ability to navigate them effectively in a global work environment.
Commitment to continuous learning and professional development in data analysis and emerging technologies.
A results-oriented approach, focusing on achieving goals and delivering measurable outcomes.
Preferred Qualifications:
A minimum of 8 years of experience working closely with senior leaders and decision-makers, demonstrating the ability to influence and drive strategic initiatives.
Proven experience in establishing and nurturing trust with business heads, fostering long-lasting business relationships that benefit both the organization and stakeholders.
Comfort with a high degree of autonomy, effectively managing priorities from multiple internal and external stakeholders to achieve organizational goals.
Basic knowledge of SQL and Generative AI is desirable, providing a foundation for data analysis and innovative solutions.
Familiarity with data visualization tools, enabling effective communication of insights through visual storytelling.
Possession of a Green Belt Certification and exposure to Lean concepts, indicating a commitment to process improvement and operational efficiency.
Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective.
Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit:
Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Equal Opportunity Employer:
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law poster describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision
10 - Officials or Managers (EEO-2 Job Categories - United States of America), DTMGOP103.2 - Middle Management Tier II (EEO Job Group), SWP Priority Ratings (Strategic Workforce Planning)
Posted 2 months ago
4 - 9 years
14 - 18 Lacs
Noida
Work from Office
Who We Are
Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you.
What you need
* BS in an Engineering or Science discipline, or equivalent experience
* 7+ years of software/data engineering experience using Java, Scala, and/or Python, with at least 5 years' experience in a data-focused role
* Experience in data integration (ETL/ELT) development using multiple languages (e.g., Java, Scala, Python, PySpark, SparkSQL)
* Experience building and maintaining data pipelines supporting a variety of integration patterns (batch, replication/CDC, event streaming) and data lake/warehouse in production environments
* Experience with AWS-based data services technologies (e.g., Kinesis, Glue, RDS, Athena, etc.) and Snowflake CDW
* Experience working on larger initiatives building and rationalizing large-scale data environments with a wide variety of data pipelines, possibly with internal and external partner integrations, would be a plus
* Willingness to experiment and learn new approaches and technology applications
* Knowledge and experience with various relational databases and demonstrable proficiency in SQL, supporting analytics uses and users
* Knowledge of software engineering and agile development best practices
* Excellent written and verbal communication skills
The Brightly culture
We're guided by a vision of community that serves the ambitions and wellbeing of all people, and our professional communities are no exception. We model that ideal every day by being supportive, collaborative partners to one another, conscientiously making space for our colleagues to grow and thrive.
Our passionate team is driven to create a future where smarter infrastructure protects the environments that shape and connect us all. That brighter future starts with us.
Posted 2 months ago
4 - 9 years
25 - 30 Lacs
Hyderabad
Work from Office
Overview
In this role, we are seeking an Associate Manager, Offshore Program & Delivery Management, to oversee program execution, governance, and service delivery across DataOps, BIOps, AIOps, MLOps, Data IntegrationOps, SRE, and Value Delivery programs. This role requires expertise in offshore execution, cost optimization, automation strategies, and cross-functional collaboration to enhance operational excellence.
Collaborate with global teams to support Data & Analytics transformation efforts and ensure sustainable, scalable, and cost-effective operations.
Support the standardization and automation of pipeline workflows, report generation, and dashboard refreshes to enhance efficiency.
Manage and support DataOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy.
Assist in real-time monitoring, automated alerting, and self-healing mechanisms to improve system reliability and performance.
Contribute to the development and enforcement of governance models and operational frameworks to streamline service delivery and execution roadmaps.
Assist in proactive issue identification and self-healing automation, enhancing the sustainment capabilities of the PepsiCo Data Estate.
Responsibilities
Support DataOps and SRE operations, assisting in offshore delivery of DataOps, BIOps, Data IntegrationOps, and related initiatives.
Assist in implementing governance frameworks, tracking KPIs, and ensuring adherence to operational SLAs.
Contribute to process standardization and automation efforts, improving service efficiency and scalability.
Collaborate with onshore teams and business stakeholders, ensuring alignment of offshore activities with business needs.
Monitor and optimize resource utilization, leveraging automation and analytics to improve productivity.
Support continuous improvement efforts, identifying operational risks and ensuring compliance with security and governance policies.
Assist in managing day-to-day DataOps activities, including incident resolution, SLA adherence, and stakeholder engagement.
Participate in Agile work intake and management processes, contributing to strategic execution within data platform teams.
Provide operational support for cloud infrastructure and data services, ensuring high availability and performance.
Document and enhance operational policies and crisis management functions, supporting rapid incident response.
Promote a customer-centric approach, ensuring high service quality and proactive issue resolution.
Assist in team development efforts, fostering a collaborative and agile work environment.
Adapt to changing priorities, supporting teams in maintaining focus on key deliverables.
Qualifications
6+ years of technology experience in a global organization, preferably in the CPG industry.
4+ years of experience in Data & Analytics, with a foundational understanding of data engineering, data management, and operations.
3+ years of cross-functional IT experience, working with diverse teams and stakeholders.
1-2 years of leadership or coordination experience, supporting team operations and service delivery.
Strong communication and collaboration skills, with the ability to convey technical concepts to non-technical audiences.
Customer-focused mindset, ensuring high-quality service and responsiveness to business needs.
Experience in supporting technical operations for enterprise data platforms, preferably in a Microsoft Azure environment.
Basic understanding of Site Reliability Engineering (SRE) practices, including incident response, monitoring, and automation.
Ability to drive operational stability, supporting proactive issue resolution and performance optimization.
Strong analytical and problem-solving skills, with a continuous improvement mindset.
Experience working in large-scale, data-driven environments, ensuring smooth operations of business-critical solutions.
Ability to support governance and compliance initiatives, ensuring adherence to data standards and best practices.
Familiarity with data acquisition, cataloging, and data management tools.
Strong organizational skills, with the ability to manage multiple priorities effectively.
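The SLA-adherence and automated-alerting duties described for this role reduce, at their simplest, to threshold checks over pipeline run metrics. The sketch below is hypothetical: the metric names, thresholds, and run data are invented for illustration, and a real DataOps setup would pull these from a monitoring platform rather than hard-coded records.

```python
from dataclasses import dataclass

@dataclass
class PipelineRun:
    name: str
    duration_min: float
    failed: bool

# Hypothetical SLA: a run breaches if it failed outright or exceeded 60 minutes.
SLA_MINUTES = 60.0

def sla_breaches(runs: list[PipelineRun]) -> list[str]:
    """Return the names of runs that would trigger an alert."""
    return [r.name for r in runs if r.failed or r.duration_min > SLA_MINUTES]

runs = [
    PipelineRun("sales_ingest", 42.0, False),
    PipelineRun("inventory_sync", 75.5, False),  # too slow: SLA breach
    PipelineRun("finance_load", 12.0, True),     # failed outright
]
print(sla_breaches(runs))
```

A self-healing extension would map each breach to a remediation action (retry, scale up, page on-call) instead of merely listing it; the detection step stays the same.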
Posted 2 months ago
5 - 10 years
22 - 27 Lacs
Hyderabad, Bengaluru
Work from Office
Location: Hyderabad, Bangalore - India
Function: HV Product Management
Requisition ID: 1033000
The Company
We're Hitachi Vantara, the data foundation trusted by the world's innovators. Our resilient, high-performance data infrastructure means that customers, from banks to theme parks, can focus on achieving the incredible with data. If you've seen the Las Vegas Sphere, you've seen just one example of how we empower businesses to automate, optimize, innovate, and wow their customers. Right now, we're laying the foundation for our next wave of growth. We're looking for people who love being part of a diverse, global team, and who get excited about making a real-world impact with data.
The Team
The VSP 360 team is focused on building an intelligent, hybrid cloud platform that integrates observability, automation, protection, and data insights. As part of this mission, we are expanding platform capabilities to include rich data services integrations that enhance visibility, governance, compliance, and cyber resilience. This team works cross-functionally with engineering, partner ecosystems, and customer-facing teams to deliver seamless experiences and actionable insights from a wide range of data services and third-party platforms.
The Role
As the Product Manager for Data Services within the VSP 360 platform, you will lead the strategy and execution for integrating a diverse set of data services that drive data intelligence, governance, and protection. This includes managing platform-level integrations with services such as data classification, data cataloging, PII detection, cyber resilience, and third-party data protection solutions. You'll collaborate with internal and external stakeholders to define use cases, capture integration requirements, and drive partner enablement. Your role will focus on building scalable APIs and workflows that bring context-rich insights and automation to the forefront of hybrid cloud storage management.
You will be responsible for managing the backlog in Aha!, coordinating cross-functional execution, and ensuring customer-facing outcomes around security, compliance, and operational efficiency.
What You'll Bring
5+ years of product management experience in data services, storage, or enterprise software
Strong understanding of data classification, cataloging, governance, and PII/security frameworks
Familiarity with cyber resilience concepts and tools
Experience integrating third-party solutions (e.g., Commvault, Veeam) into a platform environment
Proven ability to define APIs and workflows for data services integration
Agile product management experience with tools like Aha!, Jira, or equivalent
Ability to balance technical requirements with customer value and usability
Strong collaboration and communication skills across product, engineering, and partners
Strategic mindset with experience driving partner ecosystems and joint solutions
Passion for delivering customer-centric solutions with measurable business impact
About us
We're a global team of innovators. Together, we harness engineering excellence and passion for insight to co-create meaningful solutions to complex challenges. We turn organizations into data-driven leaders that can make a positive impact on their industries and society. If you believe that innovation can inspire the future, this is the place to fulfil your purpose and achieve your potential.
Championing diversity, equity, and inclusion
Posted 2 months ago
2 - 3 years
11 - 16 Lacs
Noida
Work from Office
Arcadis is the world's leading company delivering sustainable design, engineering, and consultancy solutions for natural and built assets. We are more than 36,000 people, in over 70 countries, dedicated to improving quality of life. Everyone has an important role to play. With the power of many curious minds, together we can solve the world's most complex challenges and deliver more impact together.
Role description
As an Assistant Global Power Platform Developer at Arcadis, you will be responsible for designing, developing, and implementing custom solutions using Microsoft Power Platform. You will work closely with our four Business Areas and associated stakeholders to establish governance as well as guidance, training, and compliance guardrails.
Role accountabilities:
Work closely with Power Platform leads to oversee usage and manage the Power Platform.
Work closely with Power Platform leads and business analysts to gather requirements and understand how to leverage Power Platform as a solution to business needs.
Design, develop, secure, and extend Power Platform solutions and reusable solution components based on business requirements.
Develop and maintain documentation related to the solutions you create.
Design and develop UI/UX experiences that are engaging and user-friendly.
Provide technical support as well as guidance to the business as needed through our Community of Practice or ticketing tool.
Nurture the community of Arcadis Power Platform developers by providing best practices, guidelines, and guardrails through SharePoint articles, documents, training, and hosting monthly global calls.
Keep up to date with the latest Power Platform features and technologies.
Align with, steward, and promote organizational best practices and governance (e.g., InfoSec, Privacy, Data Sovereignty, Access Rights & Permissions).
Work with the wider team to define, create, and promote best practices to ensure organizational compliance.
Collaborate with Global IT and business stakeholders to identify opportunities for process automation and optimization using Power Platform.
Conduct thorough testing and quality assurance of developed solutions to ensure they meet requirements and are free of bugs and errors.
Qualifications & Experience:
Education: Bachelor's degree in Information Technology, Computer Science, or a suitably related field.
Technical Experience: Minimum of 2 years of development experience working within Power Platform and Dataverse/Common Data Service. Holding a Microsoft Certified Power Platform Developer Associate certification is an advantage.
Experience in designing, developing, and implementing Power Apps (Canvas and Model-Driven Apps), Dataverse, Power Automate (Cloud and Desktop flows), Power Pages, and/or Copilot Studio (formerly Power Virtual Agents).
Experience with Microsoft Azure, Dynamics 365, and/or Microsoft 365 (SharePoint and Teams) is preferable.
Experience with Azure DevOps and Power Platform pipelines to manage the Power Platform solution lifecycle is preferable.
Experience with GitHub is preferable.
Development experience using JavaScript, JSON, TypeScript, C#, HTML, .NET, Azure, Microsoft 365, RESTful web services, ASP.NET, and/or Power Query is an advantage.
Methodology Experience: Experience working in the software development lifecycle using an Agile approach is preferable.
Soft Skills:
Excellent stakeholder management skills with experience working with a diverse set of stakeholders.
Maintains an awareness of developing technologies and their application, and takes responsibility for personal development.
Familiarity with data modelling, database design, and data integration techniques.
Strong problem-solving and analytical skills, with the ability to think creatively and innovate.
Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
Why Arcadis?
We can only achieve our goals when everyone is empowered to be their best. We believe everyone's contribution matters. It's why we are pioneering a skills-based approach, where you can harness your unique experience and expertise to carve your career path and maximize the impact we can make together. You'll do meaningful work, and no matter what role, you'll be helping to deliver sustainable solutions for a more prosperous planet. Make your mark on your career, your colleagues, your clients, your life, and the world around you. Together, we can create a lasting legacy. Join Arcadis. Create a Legacy.
Our Commitment to Equality, Diversity, Inclusion & Belonging
We want you to be able to bring your best self to work every day, which is why we take equality and inclusion seriously and hold ourselves to account for our actions. Our ambition is to be an employer of choice and provide a great place to work for all our people. At Arcadis, you will have the opportunity to build the career that is right for you. Because each Arcadian has their own motivations and their own career goals. And, as a people-first business, that is why we will take the time to listen, to understand what you want from your time here, and provide the support you need to achieve your ambitions.
Posted 2 months ago