3.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements
Oracle Data Integrator (ODI) / PL/SQL Specialist
As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on premises and on Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions or migrating an application to co-exist in a hybrid cloud (on-premises and cloud). Our teams have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed.
Work You'll Do
As an ODI developer you will have multiple responsibilities depending on the project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure; another might involve building ETL solutions both on premises and on Oracle Cloud. The key responsibilities may involve some or all of the areas listed below:
Engage with clients to understand business requirements, document user stories and focus on user experience
Build proofs of concept to showcase the value of Oracle Analytics versus other platforms
Socialize solution designs and enable knowledge transfer
Drive train-the-trainer sessions to drive adoption of OAC
Partner with clients to drive outcomes and deliver value
Collaborate with cross-functional teams to understand dependencies on source applications
Analyze data sets to understand functional and business context
Understand the data warehousing data model and integration design
Understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR) and Project to Complete (PTC)
Communicate development status to key stakeholders
Technical Requirements:
Education: B.E./B.Tech/M.C.A./M.Sc (CS)
3-6 years of ETL lead/developer experience, including a minimum of 3-4 years of experience in Oracle Data Integrator (ODI)
Expertise in the Oracle ODI toolset and Oracle PL/SQL
Minimum 2-3 end-to-end DWH implementations
Experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc.
Ability to implement reusability, parameterization, workflow design, etc.
Knowledge of the ODI master and work repositories
Knowledge of data modelling and ETL design
Design and develop complex mappings, process flows and ETL scripts
Must be well versed and hands-on in using and customizing Knowledge Modules (KMs)
Setting up topology, building objects in Designer, monitoring in Operator, different types of KMs, agents, etc.
Packaging components and database operations such as aggregate, pivot, union, etc.
Using ODI mappings, error handling, automation using ODI, load plans and migration of objects (see the error-logging sketch at the end of this posting)
Experience in performance tuning of mappings
Ability to design ETL unit test cases and debug ETL mappings
Expertise in developing load plans and scheduling jobs
Integration of ODI with multiple sources/targets
Experience in data migration using SQL*Loader and import/export
Expertise in database development (SQL/PL-SQL) for PL/SQL-based applications
Experience in designing and developing Oracle objects such as tables, views, indexes, partitions, stored procedures and functions in PL/SQL, packages, materialized views and analytical functions
Working knowledge of Git or a similar source code control system
Experience creating PL/SQL packages, procedures, functions, triggers, views and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle
Experience in SQL tuning and optimization using explain plan and SQL trace files
Partitioning and indexing strategies for optimal performance
Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities
Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents
Preferred:
Experience in Oracle BI Apps
Exposure to one or more of the following: Python, R or UNIX shell scripting
Systematic problem-solving approach, coupled with strong communication skills
Ability to debug and optimize code and automate routine tasks
Experience writing scripts in one or more languages such as Python, UNIX scripting and/or similar
Experience working with technical customers
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are.
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 302893
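The posting above calls for ETL control tables, error logging and auditing alongside ODI and PL/SQL work. Below is a minimal, illustrative sketch of one way such a run-log pattern might be scripted from Python with the python-oracledb driver; the ETL_RUN_LOG table, column names, credentials and load statement are all hypothetical placeholders, not a real project schema or an ODI feature.

```python
# Minimal sketch of an ETL run-log / error-logging pattern (illustrative only).
# Assumes a hypothetical ETL_RUN_LOG table and the python-oracledb driver;
# table names, columns and credentials are placeholders, not a real project's schema.
import datetime
import oracledb  # pip install oracledb

DSN = "dbhost:1521/ORCLPDB1"  # placeholder connection string

def log_run(cursor, job_name, status, rows_loaded=0, error_msg=None):
    """Insert one audit row into the hypothetical ETL_RUN_LOG control table."""
    cursor.execute(
        """
        INSERT INTO etl_run_log (job_name, run_ts, status, rows_loaded, error_msg)
        VALUES (:job_name, :run_ts, :status, :rows_loaded, :error_msg)
        """,
        job_name=job_name,
        run_ts=datetime.datetime.now(),
        status=status,
        rows_loaded=rows_loaded,
        error_msg=error_msg,
    )

def run_job(job_name, load_sql):
    with oracledb.connect(user="etl_user", password="***", dsn=DSN) as conn:
        cursor = conn.cursor()
        try:
            cursor.execute(load_sql)            # e.g. an INSERT ... SELECT load step
            log_run(cursor, job_name, "SUCCESS", rows_loaded=cursor.rowcount)
            conn.commit()
        except oracledb.DatabaseError as exc:
            conn.rollback()
            log_run(cursor, job_name, "FAILED", error_msg=str(exc)[:4000])
            conn.commit()                       # keep the audit row even though the load failed
            raise

if __name__ == "__main__":
    run_job("LOAD_SALES_STG", "INSERT INTO sales_stg SELECT * FROM sales_src")
```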
Posted 1 month ago
3.0 - 6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements
Oracle Data Integrator (ODI) / PL/SQL Specialist
As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on premises and on Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions or migrating an application to co-exist in a hybrid cloud (on-premises and cloud). Our teams have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed.
Work You'll Do
As an ODI developer you will have multiple responsibilities depending on the project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure; another might involve building ETL solutions both on premises and on Oracle Cloud. The key responsibilities may involve some or all of the areas listed below:
Engage with clients to understand business requirements, document user stories and focus on user experience
Build proofs of concept to showcase the value of Oracle Analytics versus other platforms
Socialize solution designs and enable knowledge transfer
Drive train-the-trainer sessions to drive adoption of OAC
Partner with clients to drive outcomes and deliver value
Collaborate with cross-functional teams to understand dependencies on source applications
Analyze data sets to understand functional and business context
Understand the data warehousing data model and integration design
Understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR) and Project to Complete (PTC)
Communicate development status to key stakeholders
Technical Requirements:
Education: B.E./B.Tech/M.C.A./M.Sc (CS)
3-6 years of ETL lead/developer experience, including a minimum of 3-4 years of experience in Oracle Data Integrator (ODI)
Expertise in the Oracle ODI toolset and Oracle PL/SQL
Minimum 2-3 end-to-end DWH implementations
Experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc.
Ability to implement reusability, parameterization, workflow design, etc.
Knowledge of the ODI master and work repositories
Knowledge of data modelling and ETL design
Design and develop complex mappings, process flows and ETL scripts
Must be well versed and hands-on in using and customizing Knowledge Modules (KMs)
Setting up topology, building objects in Designer, monitoring in Operator, different types of KMs, agents, etc.
Packaging components and database operations such as aggregate, pivot, union, etc.
Using ODI mappings, error handling, automation using ODI, load plans and migration of objects
Experience in performance tuning of mappings
Ability to design ETL unit test cases and debug ETL mappings
Expertise in developing load plans and scheduling jobs
Integration of ODI with multiple sources/targets
Experience in data migration using SQL*Loader and import/export
Expertise in database development (SQL/PL-SQL) for PL/SQL-based applications
Experience in designing and developing Oracle objects such as tables, views, indexes, partitions, stored procedures and functions in PL/SQL, packages, materialized views and analytical functions
Working knowledge of Git or a similar source code control system
Experience creating PL/SQL packages, procedures, functions, triggers, views and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle
Experience in SQL tuning and optimization using explain plan and SQL trace files (see the sketch at the end of this posting)
Partitioning and indexing strategies for optimal performance
Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities
Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents
Preferred:
Experience in Oracle BI Apps
Exposure to one or more of the following: Python, R or UNIX shell scripting
Systematic problem-solving approach, coupled with strong communication skills
Ability to debug and optimize code and automate routine tasks
Experience writing scripts in one or more languages such as Python, UNIX scripting and/or similar
Experience working with technical customers
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are.
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 302893
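The posting above asks for SQL tuning experience using explain plan. The following is a minimal sketch of how such a tuning check might be scripted in Python with the python-oracledb driver; the connection details and the statement being examined are placeholders, and this is not an official Oracle or Deloitte utility.

```python
# Minimal sketch: capture an Oracle execution plan for a candidate SQL statement.
# Assumes the python-oracledb driver; credentials and the SQL under test are placeholders.
import oracledb  # pip install oracledb

SQL_UNDER_TEST = "SELECT /* candidate */ * FROM sales s JOIN customers c ON c.cust_id = s.cust_id"

def show_plan(dsn, user, password, sql_text):
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        cursor = conn.cursor()
        # Ask the optimizer for the plan without executing the statement.
        cursor.execute("EXPLAIN PLAN FOR " + sql_text)
        # DBMS_XPLAN.DISPLAY reads the plan back from PLAN_TABLE as text rows.
        cursor.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
        for (line,) in cursor:
            print(line)

if __name__ == "__main__":
    show_plan("dbhost:1521/ORCLPDB1", "etl_user", "***", SQL_UNDER_TEST)
```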
Posted 1 month ago
3.0 - 6.0 years
0 Lacs
Greater Kolkata Area
On-site
Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements
Oracle Data Integrator (ODI) / PL/SQL Specialist
As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on premises and on Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions or migrating an application to co-exist in a hybrid cloud (on-premises and cloud). Our teams have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed.
Work You'll Do
As an ODI developer you will have multiple responsibilities depending on the project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure; another might involve building ETL solutions both on premises and on Oracle Cloud. The key responsibilities may involve some or all of the areas listed below:
Engage with clients to understand business requirements, document user stories and focus on user experience
Build proofs of concept to showcase the value of Oracle Analytics versus other platforms
Socialize solution designs and enable knowledge transfer
Drive train-the-trainer sessions to drive adoption of OAC
Partner with clients to drive outcomes and deliver value
Collaborate with cross-functional teams to understand dependencies on source applications
Analyze data sets to understand functional and business context
Understand the data warehousing data model and integration design
Understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR) and Project to Complete (PTC)
Communicate development status to key stakeholders
Technical Requirements:
Education: B.E./B.Tech/M.C.A./M.Sc (CS)
3-6 years of ETL lead/developer experience, including a minimum of 3-4 years of experience in Oracle Data Integrator (ODI)
Expertise in the Oracle ODI toolset and Oracle PL/SQL
Minimum 2-3 end-to-end DWH implementations
Experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc.
Ability to implement reusability, parameterization, workflow design, etc.
Knowledge of the ODI master and work repositories
Knowledge of data modelling and ETL design
Design and develop complex mappings, process flows and ETL scripts
Must be well versed and hands-on in using and customizing Knowledge Modules (KMs)
Setting up topology, building objects in Designer, monitoring in Operator, different types of KMs, agents, etc.
Packaging components and database operations such as aggregate, pivot, union, etc.
Using ODI mappings, error handling, automation using ODI, load plans and migration of objects
Experience in performance tuning of mappings
Ability to design ETL unit test cases and debug ETL mappings (see the sketch at the end of this posting)
Expertise in developing load plans and scheduling jobs
Integration of ODI with multiple sources/targets
Experience in data migration using SQL*Loader and import/export
Expertise in database development (SQL/PL-SQL) for PL/SQL-based applications
Experience in designing and developing Oracle objects such as tables, views, indexes, partitions, stored procedures and functions in PL/SQL, packages, materialized views and analytical functions
Working knowledge of Git or a similar source code control system
Experience creating PL/SQL packages, procedures, functions, triggers, views and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle
Experience in SQL tuning and optimization using explain plan and SQL trace files
Partitioning and indexing strategies for optimal performance
Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities
Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents
Preferred:
Experience in Oracle BI Apps
Exposure to one or more of the following: Python, R or UNIX shell scripting
Systematic problem-solving approach, coupled with strong communication skills
Ability to debug and optimize code and automate routine tasks
Experience writing scripts in one or more languages such as Python, UNIX scripting and/or similar
Experience working with technical customers
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are.
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 302893
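The posting above calls for designing ETL unit test cases. Below is a minimal sketch of a row-count and amount reconciliation check between a source and a target table, written against the python-oracledb driver with placeholder table and column names; it illustrates the idea only and is not a prescribed test framework.

```python
# Minimal sketch of an ETL reconciliation-style unit test (illustrative only).
# Table names, columns and credentials are placeholders; assumes python-oracledb.
import oracledb  # pip install oracledb

def scalar(cursor, sql):
    """Run a single-value query and return that value."""
    cursor.execute(sql)
    return cursor.fetchone()[0]

def test_sales_load(dsn, user, password):
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        cursor = conn.cursor()
        src_rows = scalar(cursor, "SELECT COUNT(*) FROM sales_src")
        tgt_rows = scalar(cursor, "SELECT COUNT(*) FROM sales_fact")
        src_amt = scalar(cursor, "SELECT NVL(SUM(amount), 0) FROM sales_src")
        tgt_amt = scalar(cursor, "SELECT NVL(SUM(amount), 0) FROM sales_fact")

    assert src_rows == tgt_rows, f"Row count mismatch: source={src_rows}, target={tgt_rows}"
    assert src_amt == tgt_amt, f"Amount mismatch: source={src_amt}, target={tgt_amt}"
    print("Reconciliation passed:", src_rows, "rows,", src_amt, "total amount")

if __name__ == "__main__":
    test_sales_load("dbhost:1521/ORCLPDB1", "etl_user", "***")
```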
Posted 1 month ago
6.0 years
0 Lacs
Greater Kolkata Area
On-site
Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements
Informatica Intelligent Cloud Services (IICS) – Cloud Data Integration Service Consultant
The position is suited for individuals who can work in a constantly challenging environment and deliver effectively and efficiently. The individual will need to be adaptive and able to react quickly to changing business needs.
Work you'll do
IICS Cloud Data Integration Service Developer:
The IICS Cloud Data Integration Service Developer/Designer will be responsible for the design and development of IICS Cloud Data Integration services using Data Synchronization Service (DSS), Data Replication Service (DRS), Mappings, Mapping Tasks and Task Flows
Contribute to architecture and configuration management to ensure solutions are aligned with architecture and business objectives
Design and implement an IICS Cloud Data Integration framework for end-to-end integration solutions involving various services and advanced SQL queries
Develop and implement audit, monitoring, backup and archival solutions as part of the integration process to support customer requirements
Debug and troubleshoot any performance issues, data issues or errors in mappings
Create high-quality code leveraging the latest frameworks and guidelines
Proactively participate in design meetings and daily status updates
Create test scripts and execute unit, module and system tests for ETL processes and the resulting data contents
Create detailed documents in adherence to SDLC standards, ensuring timely submission of documents for approval across the different stages of project implementation
Qualifications
Required:
Education: Bachelor's/Master's degree in Computer Science / MCA / M.Sc / MBA
3 to 6 years of experience in the implementation of data warehouse/BI solutions
Preferred:
Strong problem-solving and analytical capabilities
Well versed with data integration and data warehousing concepts
Hands-on development experience in core IICS Cloud Data Integration service components such as Data Synchronization Service (DSS), Data Replication Service (DRS), Mappings, Mapping Tasks and Task Flows
Experience with at least one RDBMS such as Oracle, Teradata, SQL Server or DB2
Shell scripting in a Linux/Unix environment is mandatory
Good understanding of the IICS architecture and advanced transformations such as Hierarchy Builder, JSON/XML Parser, Mapplet, Rank, Union and SQL (see the sketch at the end of this posting)
Good understanding of IICS parameterization, REST API, exception and error handling methods, and debugging
Performance-improvement skills on IICS/Informatica and RDBMS are desirable
Participation in different kinds of testing, such as unit testing, system testing and user acceptance testing
Participation in at least one end-to-end data integration implementation for a cloud-based application such as Salesforce.com, Redshift, Snowflake, etc.
Ability to understand both ER and dimensional models
Self-starter in solution implementation with inputs from design documents
Experience with automation and scheduling tools is preferable
Good communication skills
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 300109
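Among the preferred skills above are hierarchy and JSON/XML parsing transformations. As a rough analogy to what such a transformation does, here is a small Python sketch that flattens a nested JSON order record into flat rows; the record layout is invented for illustration and has no connection to any specific IICS mapping or service.

```python
# Minimal sketch: flatten a nested JSON document into flat rows, roughly the job a
# hierarchy/JSON parser transformation performs. The record layout is invented.
import json

SAMPLE = """
{
  "order_id": 1001,
  "customer": {"id": "C42", "name": "Acme Corp"},
  "lines": [
    {"sku": "A-100", "qty": 2, "price": 50.0},
    {"sku": "B-200", "qty": 1, "price": 125.5}
  ]
}
"""

def flatten_order(doc: dict):
    """Yield one flat row per order line, repeating the header attributes."""
    for line in doc.get("lines", []):
        yield {
            "order_id": doc["order_id"],
            "customer_id": doc["customer"]["id"],
            "customer_name": doc["customer"]["name"],
            "sku": line["sku"],
            "qty": line["qty"],
            "amount": line["qty"] * line["price"],
        }

if __name__ == "__main__":
    for row in flatten_order(json.loads(SAMPLE)):
        print(row)
```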
Posted 1 month ago
3.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements
Oracle Data Integrator (ODI) / PL/SQL Specialist
As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on premises and on Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions or migrating an application to co-exist in a hybrid cloud (on-premises and cloud). Our teams have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed.
Work You'll Do
As an ODI developer you will have multiple responsibilities depending on the project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure; another might involve building ETL solutions both on premises and on Oracle Cloud. The key responsibilities may involve some or all of the areas listed below:
Engage with clients to understand business requirements, document user stories and focus on user experience
Build proofs of concept to showcase the value of Oracle Analytics versus other platforms
Socialize solution designs and enable knowledge transfer
Drive train-the-trainer sessions to drive adoption of OAC
Partner with clients to drive outcomes and deliver value
Collaborate with cross-functional teams to understand dependencies on source applications
Analyze data sets to understand functional and business context
Understand the data warehousing data model and integration design
Understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR) and Project to Complete (PTC)
Communicate development status to key stakeholders
Technical Requirements:
Education: B.E./B.Tech/M.C.A./M.Sc (CS)
3-6 years of ETL lead/developer experience, including a minimum of 3-4 years of experience in Oracle Data Integrator (ODI)
Expertise in the Oracle ODI toolset and Oracle PL/SQL
Minimum 2-3 end-to-end DWH implementations
Experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc.
Ability to implement reusability, parameterization, workflow design, etc.
Knowledge of the ODI master and work repositories
Knowledge of data modelling and ETL design
Design and develop complex mappings, process flows and ETL scripts
Must be well versed and hands-on in using and customizing Knowledge Modules (KMs)
Setting up topology, building objects in Designer, monitoring in Operator, different types of KMs, agents, etc.
Packaging components and database operations such as aggregate, pivot, union, etc.
Using ODI mappings, error handling, automation using ODI, load plans and migration of objects
Experience in performance tuning of mappings
Ability to design ETL unit test cases and debug ETL mappings
Expertise in developing load plans and scheduling jobs
Integration of ODI with multiple sources/targets
Experience in data migration using SQL*Loader and import/export
Expertise in database development (SQL/PL-SQL) for PL/SQL-based applications
Experience in designing and developing Oracle objects such as tables, views, indexes, partitions, stored procedures and functions in PL/SQL, packages, materialized views and analytical functions
Working knowledge of Git or a similar source code control system
Experience creating PL/SQL packages, procedures, functions, triggers, views and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle (see the sketch at the end of this posting)
Experience in SQL tuning and optimization using explain plan and SQL trace files
Partitioning and indexing strategies for optimal performance
Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities
Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents
Preferred:
Experience in Oracle BI Apps
Exposure to one or more of the following: Python, R or UNIX shell scripting
Systematic problem-solving approach, coupled with strong communication skills
Ability to debug and optimize code and automate routine tasks
Experience writing scripts in one or more languages such as Python, UNIX scripting and/or similar
Experience working with technical customers
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are.
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 302893
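The posting above includes creating PL/SQL packages and procedures with exception handling. The sketch below shows how such a packaged procedure might be invoked and its errors surfaced from Python using python-oracledb; the package name, parameters and credentials are hypothetical placeholders, not an actual API.

```python
# Minimal sketch: call a (hypothetical) PL/SQL packaged procedure and surface its errors.
# Assumes python-oracledb; the package name, parameters and credentials are placeholders.
import datetime
import oracledb  # pip install oracledb

def load_customers(dsn, user, password, batch_date):
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        cursor = conn.cursor()
        rows_loaded = cursor.var(int)  # holder for the OUT parameter
        try:
            # Hypothetical signature: etl_pkg.load_customers(p_batch_date IN DATE, p_rows OUT NUMBER)
            cursor.callproc("etl_pkg.load_customers", [batch_date, rows_loaded])
            conn.commit()
            print("Loaded rows:", rows_loaded.getvalue())
        except oracledb.DatabaseError as exc:
            (error_obj,) = exc.args
            # Error code and message raised by the procedure's exception handler
            print(f"Load failed: ORA-{error_obj.code}: {error_obj.message}")
            conn.rollback()
            raise

if __name__ == "__main__":
    load_customers("dbhost:1521/ORCLPDB1", "etl_user", "***", datetime.date.today())
```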
Posted 1 month ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements
Informatica Intelligent Cloud Services (IICS) – Cloud Data Integration Service Consultant
The position is suited for individuals who can work in a constantly challenging environment and deliver effectively and efficiently. The individual will need to be adaptive and able to react quickly to changing business needs.
Work you'll do
IICS Cloud Data Integration Service Developer:
The IICS Cloud Data Integration Service Developer/Designer will be responsible for the design and development of IICS Cloud Data Integration services using Data Synchronization Service (DSS), Data Replication Service (DRS), Mappings, Mapping Tasks and Task Flows
Contribute to architecture and configuration management to ensure solutions are aligned with architecture and business objectives
Design and implement an IICS Cloud Data Integration framework for end-to-end integration solutions involving various services and advanced SQL queries
Develop and implement audit, monitoring, backup and archival solutions as part of the integration process to support customer requirements
Debug and troubleshoot any performance issues, data issues or errors in mappings
Create high-quality code leveraging the latest frameworks and guidelines
Proactively participate in design meetings and daily status updates
Create test scripts and execute unit, module and system tests for ETL processes and the resulting data contents
Create detailed documents in adherence to SDLC standards, ensuring timely submission of documents for approval across the different stages of project implementation
Qualifications
Required:
Education: Bachelor's/Master's degree in Computer Science / MCA / M.Sc / MBA
3 to 6 years of experience in the implementation of data warehouse/BI solutions
Preferred:
Strong problem-solving and analytical capabilities
Well versed with data integration and data warehousing concepts
Hands-on development experience in core IICS Cloud Data Integration service components such as Data Synchronization Service (DSS), Data Replication Service (DRS), Mappings, Mapping Tasks and Task Flows
Experience with at least one RDBMS such as Oracle, Teradata, SQL Server or DB2
Shell scripting in a Linux/Unix environment is mandatory
Good understanding of the IICS architecture and advanced transformations such as Hierarchy Builder, JSON/XML Parser, Mapplet, Rank, Union and SQL
Good understanding of IICS parameterization, REST API, exception and error handling methods, and debugging (see the sketch at the end of this posting)
Performance-improvement skills on IICS/Informatica and RDBMS are desirable
Participation in different kinds of testing, such as unit testing, system testing and user acceptance testing
Participation in at least one end-to-end data integration implementation for a cloud-based application such as Salesforce.com, Redshift, Snowflake, etc.
Ability to understand both ER and dimensional models
Self-starter in solution implementation with inputs from design documents
Experience with automation and scheduling tools is preferable
Good communication skills
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 300109
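The preferred skills above mention exception and error handling around integration runs. Below is a small, generic Python sketch of a retry wrapper that could sit around any task-flow trigger function; the trigger itself is left as a placeholder, since the actual IICS invocation mechanism (REST call, command line, scheduler) depends on the project, and the simulated failure exists only to exercise the handler.

```python
# Minimal, generic sketch of retry-with-backoff error handling around an integration run.
# run_taskflow() is a placeholder for however the job is actually triggered (REST, CLI, ...).
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("integration")

def run_taskflow(name: str) -> None:
    """Placeholder: replace with the real trigger for the task flow."""
    raise RuntimeError(f"simulated transient failure while starting {name}")

def run_with_retry(name: str, attempts: int = 3, backoff_seconds: float = 30.0) -> None:
    for attempt in range(1, attempts + 1):
        try:
            log.info("Starting %s (attempt %d/%d)", name, attempt, attempts)
            run_taskflow(name)
            log.info("%s finished successfully", name)
            return
        except Exception:
            log.exception("%s failed on attempt %d", name, attempt)
            if attempt == attempts:
                raise  # give up so the scheduler/alerting sees the failure
            time.sleep(backoff_seconds * attempt)  # linear backoff before retrying

if __name__ == "__main__":
    run_with_retry("tf_daily_sales_load", attempts=2, backoff_seconds=1.0)
```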
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Sapiens is on the lookout for a Developer (ETL) to become a key player in our Bangalore team. If you're a seasoned ETL pro and ready to take your career to new heights with an established, globally successful company, this role could be the perfect fit.
Location: Bangalore
Working Model: Our flexible work arrangement combines both remote and in-office work, optimizing flexibility and productivity.
This position will be part of Sapiens' L&P division; for more information about it, click here: https://sapiens.com/solutions/life-and-pension-software/
What You'll Do
Design and develop flexible, extensible, multi-tier, scalable, high-performance and reliable core components/services of an advanced, complex software system called ALIS, both in R&D and Delivery.
Good understanding of advanced ETL concepts and administration activities to support R&D/projects.
Understanding of different ETL tools (minimum 4) and advanced transformations; strong in Talend and SAP BODS to support R&D/projects.
Ability to resolve all ETL code and administration issues.
Ability to resolve complex reporting challenges.
Ability to create full-fledged dashboards with storyboards/storylines, drill-downs, linking, etc.; design tables, views or data marts to support these dashboards.
Ability to understand and propose data load strategies that improve performance and visualizations.
Ability to performance-tune SQL, ETL, reports and universes.
Understand the Sapiens Intelligence product and support the points below:
Understands the Transaction Layer model for all modules
Understands the Universe model for all modules
Should have end-to-end Sapiens Intelligence knowledge
Should be able to independently demo or give training for the Sapiens Intelligence product
Should be an SME in Sapiens Intelligence as a product
What To Have For This Position
Must-have skills:
3-5 years of IT experience.
Experience with advanced insurance concepts and a good command of all business/functional areas (such as NB, claims, finance, etc.).
Experience with a complete DWH ETL lifecycle.
Experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc. - using ETL tools such as Talend, BODS, SSIS, etc. (see the sketch at the end of this posting)
Experience or knowledge of Big Data related tools (Spark, Hive, Kafka, Hadoop, Hortonworks, Python, R) would be an advantage.
Experience in developing SAP BO reports, or knowledge of any other reporting tool.
Ability to implement reusability, parameterization, workflow design, etc.
Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and low- and high-level design documents.
Experience in understanding complex source system data structures, preferably in insurance services (insurance preferred).
Experience in data analysis, data modeling and data mart design.
Strong database development skills, such as complex SQL queries and complex stored procedures.
Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities.
Ability to work with minimal guidance or supervision in a time-critical environment.
Willingness to travel and work at various customer sites across the globe.
About Sapiens
Sapiens is a global leader in the insurance industry, delivering its award-winning, cloud-based SaaS insurance platform to over 600 customers in more than 30 countries.
Sapiens' platform offers pre-integrated, low-code capabilities to accelerate customers' digital transformation. With more than 40 years of industry expertise, Sapiens has a highly professional team of over 5,000 employees globally. For more information, visit us at www.sapiens.com.
Sapiens is an equal opportunity employer. We value diversity and strive to create an inclusive work environment that embraces individuals from diverse backgrounds.
Disclaimer: Sapiens India does not authorise any third parties to release employment offers or conduct recruitment drives via a third party. Hence, beware of inauthentic and fraudulent job offers or recruitment drives from any individuals or websites purporting to represent Sapiens. Further, Sapiens does not charge any fee or other emoluments for any reason (including, without limitation, visa fees) or seek compensation from educational institutions to participate in recruitment events. Accordingly, please check the authenticity of any such offers before acting on them; where acted upon, you do so at your own risk. Sapiens shall neither be responsible for honouring or making good the promises made by fraudulent third parties, nor for any monetary or any other loss incurred by the aggrieved individual or educational institution. In the event that you come across any fraudulent activities in the name of Sapiens, please feel free to report the incident to Sapiens at sharedservices@sapiens.com.
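The must-have skills above include building data-quality checks into ETL processes. The following is a minimal pandas sketch of the kind of pre-load checks that pattern implies (nulls in key columns, duplicate business keys, out-of-range values); the column names and sample data are invented and this is not Sapiens' ALIS or Sapiens Intelligence code.

```python
# Minimal sketch of pre-load data-quality checks (illustrative only; invented columns).
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Return simple data-quality metrics for a staged policy extract."""
    return {
        "rows": len(df),
        "null_policy_no": int(df["policy_no"].isna().sum()),
        "duplicate_policy_no": int(df.duplicated(subset=["policy_no"]).sum()),
        "negative_premium": int((df["premium"] < 0).sum()),
    }

def assert_clean(report: dict) -> None:
    problems = {k: v for k, v in report.items() if k != "rows" and v > 0}
    if problems:
        raise ValueError(f"Data-quality checks failed: {problems}")

if __name__ == "__main__":
    staged = pd.DataFrame(
        {
            "policy_no": ["P001", "P002", "P002", None],
            "premium": [1200.0, 800.0, 800.0, -50.0],
        }
    )
    report = quality_report(staged)
    print(report)         # each check reports how many offending rows were found
    assert_clean(report)  # raises ValueError for this deliberately dirty sample
```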
Posted 1 month ago
4.0 years
0 Lacs
Kochi, Kerala, India
On-site
Role Description
Job Title: Business Intelligence Developer
Location: Bangalore
Experience: Minimum 4 years of relevant experience
Notice Period: Maximum 30 days
Location: Trivandrum, Cochin
Job Description
We are seeking a skilled and experienced Business Intelligence (BI) Developer with over 6 years of total experience, including a minimum of 4 years in relevant BI technologies. The ideal candidate should be capable of independently handling BI development, reporting, and data modeling tasks, and be comfortable interacting with C-suite executives and stakeholders.
Key Responsibilities
Develop and maintain BI applications using SQL Server, Salesforce (SF), Google Cloud Platform (GCP), PostgreSQL, Power BI, and Tableau.
Strong understanding and hands-on experience with data modeling concepts: dimensional and relational modeling, star schema and snowflake schema, fact and dimension tables (see the sketch below).
Translate complex business requirements into functional and non-functional technical specifications.
Collaborate effectively with business stakeholders and leadership.
Write optimized T-SQL; create user-defined functions, triggers, views and temporary tables; and implement constraints and indexes using DDL/DML.
Design and develop SSAS OLAP cubes and write complex DAX expressions.
Proficiency with external tools such as Tabular Editor and DAX Studio.
Customize and optimize stored procedures and complex SQL queries for data extraction and transformation.
Implement incremental refresh, row-level security (RLS), parameterization, dataflows, and data gateways in Power BI.
Develop and maintain SSRS and Power BI reports.
Optimize Power BI performance using mixed mode and DirectQuery configurations.
Key Skills
Power BI
Power BI tools (DAX, dataflows, etc.)
Data Analysis
Microsoft Fabric
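The responsibilities above center on dimensional modeling with fact and dimension tables. As a compact illustration of the star-schema idea, here is a pandas sketch that splits a flat sales extract into a customer dimension and a fact table keyed by a surrogate key; the columns and data are invented for the example and do not reflect any particular client model.

```python
# Minimal sketch: split a flat extract into a dimension and a fact table (star-schema idea).
# Column names and data are invented for illustration.
import pandas as pd

flat = pd.DataFrame(
    {
        "order_id": [1, 2, 3],
        "customer_name": ["Acme", "Globex", "Acme"],
        "customer_city": ["Pune", "Kochi", "Pune"],
        "amount": [250.0, 125.5, 90.0],
    }
)

# Customer dimension: one row per distinct customer, with a surrogate key.
dim_customer = (
    flat[["customer_name", "customer_city"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact table: measures plus a foreign key into the dimension.
fact_sales = flat.merge(dim_customer, on=["customer_name", "customer_city"])[
    ["order_id", "customer_key", "amount"]
]

if __name__ == "__main__":
    print(dim_customer)
    print(fact_sales)
```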
Posted 1 month ago
12.0 - 15.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
John Cockerill, enablers of opportunities
Driven since 1817 by the entrepreneurial spirit and thirst for innovation of its founder, the John Cockerill Group develops large-scale technological solutions to meet the needs of its time: facilitating access to low-carbon energies, enabling sustainable industrial production, preserving natural resources, contributing to greener mobility, enhancing security and installing essential infrastructures. Its offer to businesses, governments and communities consists of services and associated equipment for the sectors of energy, defence, industry, the environment, transport, and infrastructure. With over 6,000 employees, John Cockerill achieved a turnover of €1.046 billion in 2023 in 29 countries, on 5 continents. www.johncockerill.com
Position: CAD Designer (Skids)
Job Location: Mumbai (India)
Job Purpose
Design assemblies and sub-assemblies in 3D and/or 2D and produce manufacturing and assembly/installation plans in the field of hydrogen, covering process skids and mechanical and electrical sub-assemblies or assemblies of parts, with establishment of the corresponding bills of materials, as part of the study of products or their manufacture, including prefabrication, manufacture and assembly and/or installation.
Key Responsibilities
Receives technical input from the Design Engineer or the Technical Project Manager for the execution of the work.
Starting from established specifications (contractual, codes and/or standards), produces complex overall plans and sub-assemblies for skid design.
Produces 3D/2D drawings of parts to allow manufacturing or tests/calculations for skid assemblies.
Good understanding of the various components of process skids (vessels, heat exchangers, pumps, coolers, filters, pipes, fittings, valves, skid base frame, etc.).
Carries out quantity take-offs of equipment, assemblies or sub-assemblies.
Applies the rules, procedures and standards in force, internal or external to John Cockerill.
Prepares documents and participates in the technical follow-up at the level of the engineering study.
Establishes, in accordance with procedures, the answers to technical questions asked by internal customers.
The candidate could be a contact person for site managers and/or manufacturing managers regarding technical problems.
Participates in technical discussions, directly or indirectly, with the internal customer and partners.
Integrates manufacturing, assembly and after-sales service feedback during order fulfillment to improve the product or correct deficiencies.
Coordinates, directs and verifies the work of other designers.
Ensures consistency between the different projects, both from a technical and a methodological point of view.
Provides direct or indirect support to quality control, purchasing and sales.
Establishment of BOMs and parameterization of the software (AutoCAD Plant 3D).
Carries out calculations (tolerances, strength of materials, dimensions, weight, etc.) within the limits of the responsibilities entrusted by the design engineer.
Participates in the technical follow-up of projects.
Checks the completeness of the inputs received from the Technical Project Manager and the basic engineering department before proceeding with the work.
Creating orthographic drawings with AutoCAD Plant 3D is an advantage.
Respects the pipe stress analysis reports for piping hanger locations.
Issues the manufacturing documentation (mainly isometric drawings) in compliance with internal standards and contractual requirements.
Provides the engineers with the information needed for project progress.
Issues EBOMs (Engineering Bills of Materials) for raw material procurement.
Education And Experience
You have a higher education diploma in industrial and technical design. You have 12-15 years of experience in 3D industrial design and detail engineering. You are also dynamic, motivated, willing, open-minded and have a sense of initiative. You are methodical, rigorous and concerned with quality in the execution of your tasks. You have mastered Inventor and AutoCAD (AutoCAD Plant 3D, Navisworks and Vault are an advantage). You have solid knowledge of structure modeling; additional skills in piping, P&ID, equipment modeling, GA and detail drawings are an added advantage. Knowledge of detailed piping isometrics extraction is an advantage. Knowledge of MS Office (Word, Excel). Understanding of the professional environment in the energy field. Basic knowledge of usual codes/norms and good practice rules (ASME and EN). Mechanical drawings (cutting, machining, welding, assembly…). EBOM (Engineering Bill of Materials). Knowledge of the mechanical properties of steel elements (ASME and EN). Reading of technical documentation. Proficiency in English (written/spoken). Knowledge of French is an added advantage. Knowledge of NX and Teamcenter is an advantage.
Who We Are
About John Cockerill: John Cockerill is a global player in energy transition. With more than 200 years of experience in energy, industry and mobility, the company designs and integrates innovative technology to facilitate access to low-carbon energy. These technologies and associated expertise are dedicated to the production, storage, and distribution of electricity from renewable energy sources and to optimizing the efficiency of power plants. The technologies apply to steam-gas, hydraulic, hydrogen, solar, nuclear, wind and biomass energy. To complement its commitment to the fight against climate change, John Cockerill is also deploying solutions to contribute to greener mobility, to produce responsibly, to preserve natural resources and to fight against insecurity. In 2020 John Cockerill achieved a turnover of €1.01 bn in 19 countries. John Cockerill, which is privately owned, employs 5,200 people worldwide, including more than 400 in India.
Equal Opportunity Employer
John Cockerill and all John Cockerill Companies are equal opportunity employers that evaluate qualified applicants without regard to race, color, national origin, religion, ancestry, sex (including pregnancy, childbirth, and related medical conditions), age, marital status, disability, veteran status, citizenship status, sexual orientation, gender identity or expression, and other characteristics protected by law. John Cockerill offers you career and development opportunities within its various sectors in a friendly working environment. Do you want to work for an innovative company that will allow you to take up technical challenges on a daily basis? We look forward to receiving your application and to meeting you! Discover our job opportunities in detail on www.johncockerill.com
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Job Description:
Responsibilities
Develop technical expertise with 3GPP 5G SA SMF/UPF and LTE/NSA SGW/PGW and actively drive our 24x7 lab certification objective. Participate in Sprint Planning and design discussions with the development team to ensure that automation objectives are met. Effectively utilize project management tools such as Jira to track requirements, design, and deliveries of the project. Develop expertise in the Automated Network Testing (ANT) platform and become an advocate for its use in both a lab and production environment. Improve productivity for the overall team by proposing tools or processes that optimize our quality and rate of delivery. Understand the business objectives of our team, organization, business unit, and company. Create End-to-End (E2E) test cases and post-validation templates using ANT. Create Command-Line Interface (CLI) test case template health checks using ANT and regular expressions (see the sketch after this posting). Collaborate with platform teams, testers and infrastructure teams to troubleshoot automation, tooling and mobility infrastructure networks. Participate in script lifecycle management including development, delivery, deployment, updates, and defect management.
Basic Qualifications
Expert-level working experience with 3GPP Mobile Packet Core architecture, interfaces and call flows, with specific focus on LTE/NSA SGW/PGW and 5G SA SMF/UPF. Hands-on experience with TCP/IP, UDP/QUIC protocols, SSL/TLS encryption, etc. Good understanding of Linux, firewalls (NAT/PAT etc.), networking and routing protocols (e.g. Ethernet, VLAN, BGP etc.). Expert-level working experience troubleshooting mobility infrastructure and services. Expert-level working experience in the creation and maintenance of Azure and Azure DevOps (ADO) concepts, including specific experience with: Demonstrated ability to develop ADO Pipelines from the ground up (particularly YAML-expressed pipelines). Demonstrated use of az cli commands within ADO Pipelines as a method to interact with Azure Resources. Comprehension of the purpose and limitations of ADO agents. ADO Repos, including not only their purpose but specifically branching and tagging methods to optimize code stored in repositories for the different environments. Understanding of the capabilities of Azure Key Vault (AKV). Use of Azure Container Registry (ACR) to maintain images and charts through tools used within the pipelines to import, tag, and extract artifacts. Understanding of Azure Resource Manager (Endpoints and Service Principals). Understanding of Bicep and ARM Templates to build Azure Resources with particular emphasis on parameterization. Demonstrated use of Azure Operator Service Manager (AOSM). Ability to articulate concepts, solutions, and standards to members of the project with various levels of skill sets through presentations and/or discussions. Proficient in Python development. 3+ years of experience working with Regular Expressions (RegEx). 3+ years of experience in K8s, Azure, Azure DevOps, CI/CD. Hands-on experience developing and maintaining automation scripts following Object-Oriented Programming principles. Excellent communication, collaboration, reporting, analytical and problem-solving skills. Proficient with Agile testing methodologies and best practices. Working knowledge of Robot Framework, Selenium, JavaScript, JSON, YAML files, etc. Portfolio on GitHub or other platforms. Interest in learning new tools and technologies. Excellent communication skills (written and oral) and intercultural communication abilities.
Must have a "win as one" attitude, working as a team to deliver projects on time and with quality. Weekly Hours: 40. Time Type: Regular. Location: Bangalore, Karnataka, India. It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
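The AT&T posting above calls for regex-driven CLI health checks. Purely as an illustration (the CLI output, check names and expected values below are invented, not the ANT platform's actual templates), such a check in Python might look like this:

import re

# Hypothetical CLI output from a packet-core node; both the text and the
# patterns are examples, not vendor syntax.
cli_output = """
Session Summary
  Active PDU sessions : 10234
  UPF status          : UP
  SMF peer state      : ASSOCIATED
"""

# Each health check pairs a regex with the value it expects to capture.
health_checks = {
    "upf_up":         (re.compile(r"UPF status\s*:\s*(\w+)"), "UP"),
    "smf_associated": (re.compile(r"SMF peer state\s*:\s*(\w+)"), "ASSOCIATED"),
}

def run_health_checks(output: str) -> dict:
    """Return {check_name: True/False} for each pattern found in the output."""
    results = {}
    for name, (pattern, expected) in health_checks.items():
        match = pattern.search(output)
        results[name] = bool(match) and match.group(1) == expected
    return results

if __name__ == "__main__":
    for check, passed in run_health_checks(cli_output).items():
        print(f"{check}: {'PASS' if passed else 'FAIL'}")

In a real lab setting the output string would come from the automation platform's CLI capture rather than being hard-coded.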
Posted 1 month ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
RCE_Risk Data Engineer/Leads
Description – External
Job Description: Our Technology team builds innovative digital solutions rapidly and at scale to deliver the next generation of Financial and Non-Financial services across the globe. The position is a senior technical, hands-on delivery role, requiring knowledge of data engineering, cloud infrastructure and platform engineering, platform operations and production support using ground-breaking cloud and big data technologies. The ideal candidate, with 3-6 years of relevant experience, will possess strong technical skills, an eagerness to learn, a keen interest in the three key pillars that our team supports (Financial Crime, Financial Risk and Compliance technology transformation), the ability to work collaboratively in a fast-paced environment, and an aptitude for picking up new tools and techniques on the job, building on existing skillsets as a foundation.
In this role you will: Ingest and provision raw datasets, enriched tables, and/or curated, re-usable data assets to enable a variety of use cases. Drive improvements in the reliability and frequency of data ingestion, including increasing real-time coverage. Support and enhance data ingestion infrastructure and pipelines. Design and implement data pipelines that collect data from disparate sources across the enterprise, and from external sources, and deliver it to our data platform. Build Extract, Transform and Load (ETL) workflows, using both advanced data manipulation tools and programmatically manipulating data throughout our data flows, ensuring data is available at each stage in the data flow, and in the form needed for each system, service and customer along said data flow. Identify and onboard data sources using existing schemas and, where required, conduct exploratory data analysis to investigate and provide solutions. Evaluate modern technologies, frameworks, and tools in the data engineering space to drive innovation and improve data processing capabilities.
Core/Must-Have Skills: 3-8 years of expertise in designing and implementing data warehouses and data lakes using the Oracle tech stack (DB: PL/SQL). At least 4+ years of experience in database design and dimension modelling using Oracle PL/SQL. Should have experience working with advanced PL/SQL concepts (materialized views, global temporary tables, partitions, PL/SQL packages). Experience in SQL tuning, tuning of PL/SQL solutions, and physical optimization of databases. Experience in writing and tuning SQL scripts including tables, views, indexes and complex PL/SQL objects including procedures, functions, triggers and packages in Oracle Database 11g or higher. Experience in developing ETL processes – ETL control tables, error logging, auditing, data quality, etc. (a sketch follows this posting). Should be able to implement reusability, parameterization, workflow design, etc. Advanced working SQL knowledge and experience working with relational and NoSQL databases, as well as working familiarity with a variety of databases (Oracle, SQL Server, Neo4j). Strong analytical and critical thinking skills, with the ability to identify and resolve issues in data pipelines and systems.
Strong understanding of ETL methodologies and best practices. Collaborate with cross-functional teams to ensure successful implementation of solutions. Experience with OLAP and OLTP databases, and data structuring/modelling with an understanding of key data points.
Good to have: Experience of working in Financial Crime, Financial Risk and Compliance technology transformation domains. Certification on any cloud tech stack. Experience building and optimizing data pipelines on AWS Glue or Oracle Cloud. Design and development of systems for the maintenance of the Azure/AWS Lakehouse, ETL processes, business intelligence and data ingestion pipelines for AI/ML use cases. Experience with data visualization (Power BI/Tableau) and SSRS.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
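The EY role above highlights ETL control tables and error logging in an Oracle PL/SQL stack. A minimal sketch of that idea, assuming a hypothetical etl_control_log table and placeholder connection details, using the python-oracledb driver:

import datetime
import oracledb  # python-oracledb; the connection details and table below are placeholders

def log_etl_run(conn, job_name, status, rows_loaded, error_msg=None):
    """Insert one audit row into a hypothetical etl_control_log table."""
    sql = """
        INSERT INTO etl_control_log (job_name, run_ts, status, rows_loaded, error_msg)
        VALUES (:job_name, :run_ts, :status, :rows_loaded, :error_msg)
    """
    cur = conn.cursor()
    cur.execute(sql, {
        "job_name": job_name,
        "run_ts": datetime.datetime.now(),
        "status": status,
        "rows_loaded": rows_loaded,
        "error_msg": error_msg,
    })
    cur.close()
    conn.commit()

if __name__ == "__main__":
    conn = oracledb.connect(user="etl_user", password="***", dsn="dbhost/orclpdb1")
    try:
        # ... the actual load step would run here ...
        log_etl_run(conn, "LOAD_RISK_POSITIONS", "SUCCESS", rows_loaded=120000)
    except Exception as exc:
        log_etl_run(conn, "LOAD_RISK_POSITIONS", "FAILED", rows_loaded=0, error_msg=str(exc))
        raise
    finally:
        conn.close()

In practice the same pattern is usually wrapped in a PL/SQL package or the ETL tool's own framework; the point here is only the control-table write on both success and failure.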
Posted 1 month ago
4.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Us: Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company’s consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft with its 12,000+ professionals, is committed to continuing the Group’s 170-year heritage of building sustainable communities Job Title: TOSCA Automation Tester Experience: 4–7 Years Location: Noida Employment Type: Full-time ________________________________________ Job Summary: We are looking for a highly skilled TOSCA Automation Tester with 4–7 years of experience in automation testing, including a minimum of 3 years of hands-on experience with the TOSCA automation tool. The ideal candidate should have strong expertise in test data management, distributed execution, and web-based application testing using TOSCA. Experience with SAP modules and custom module creation using C# is a plus. ________________________________________ Key Responsibilities: • Design, develop, and execute automated test cases using TOSCA. • Utilize TDS (Test Data Service) and TDM (Test Data Management) for effective test data handling. • Perform test case parameterization using external data sources. • Execute distributed test runs using DEX (Distributed Execution). • Work on web-based applications and ensure comprehensive test coverage. • Manipulate and manage Excel data within TOSCA for test execution. • Collaborate with cross-functional teams to ensure quality deliverables. ________________________________________ Must-Have Skills: • 4–9 years of experience in automation testing. • Minimum 3 years of hands-on experience with TOSCA. • Proficiency in TDS, TDM, and Excel manipulation within TOSCA. • Experience with DEX for distributed execution. • Strong understanding of test case parameterization. • Experience testing web-based applications using TOSCA. ________________________________________ Good-to-Have Skills: • Experience in TOSCA custom module creation using C#. • Exposure to various SAP modules such as OTC, FI, CO, SD, MM, ML. • Understanding of SAP end-to-end functional processes. • Familiarity with SAP platforms like SAPCP, FIORI, S/4 HANA, SAPGUI. • Willingness to work in shifts as required.
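Test case parameterization from external data sources, as the TOSCA posting above describes, is handled inside TOSCA through TDM/TDS; the following is only a tool-agnostic Python illustration of the same data-driven idea, with an invented workbook layout:

from openpyxl import load_workbook

def load_test_parameters(path: str):
    """Read test-case parameters from a hypothetical Excel sheet.

    Assumed layout: the first row is a header such as
    case_id | order_type | quantity | expected_status
    """
    wb = load_workbook(path, read_only=True)
    ws = wb.active
    rows = ws.iter_rows(values_only=True)
    header = next(rows)
    return [dict(zip(header, row)) for row in rows if row[0] is not None]

if __name__ == "__main__":
    for case in load_test_parameters("order_test_data.xlsx"):
        # In TOSCA this mapping is done through TDM/TDS and test-case templates;
        # here we simply print the parameters a data-driven run would consume.
        print(f"Running {case['case_id']} with order_type={case['order_type']}")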
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
India
On-site
Profile
Working with our Customers to drive the Supply Chain Planning roadmap. Expertise in processes such as Demand Planning, Supply Planning, Procurement Planning, and Rough-Cut Capacity Planning for the CPG, QSR, Retail and Manufacturing industries. Defining KPIs to measure the improvement before and after the solution deployment. Lead the presales for Supply Chain related use cases. Contribute towards building pre-built assets to address various challenges in the Supply Chain Planning process. Collect, understand and communicate the business requirements for the project, and translate these into functional specifications. Demand and supply chain planning (Demand Planning, Distribution, Production and/or Procurement Planning). Set up the supply chain planning solution and define/deploy/test functional specifications in the application (configuration of user views, parameterization of algorithms, optimization), and assist in the preparation and execution of user and system test plans. Support knowledge transfer to other project team members and support the organization and performance of tests. Assist the customer in going live on the supply chain planning solution and provide support to the customer. Continuously grow knowledge of customer requirements and business processes according to industry best practices, with a clear understanding of the users’ needs and the benefits of the solution. Develop close relationships with the customer to understand the level of service delivered.
Requirements: Business degree in Logistics and Supply Chain from an accredited university. 3-5 years’ experience in Supply Chain Planning and/or Consulting (preferably both). Track record in the Supply Chain sector as a functional and/or technical consultant with demand planning or production/supply planning experience/exposure. Anaplan Solution Architect with Anaplan Optimizer implementation experience on at least 2 projects. Experience in Supply Chain Management systems and tools implementation (preferably Demand Management and/or production/supply planning areas/solutions – JDA, Demand Solutions). Knowledge of enterprise business applications and/or main ERP software modules. Usage of a database/technical engine is preferable.
Soft skills: Analysis and listening aptitudes, problem solving and workaround capabilities, with the ability to propose the appropriate solution/approach/process. Knowledge and experience with R programming or other statistical tools is a plus (Python, SAS). Willing to travel in Asia Pacific.
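The supply chain role above includes defining KPIs to measure improvement before and after deployment. A minimal, tool-agnostic sketch of two common demand-planning KPIs (MAPE and bias), using invented numbers:

def mape(actuals, forecasts):
    """Mean absolute percentage error over periods with non-zero actuals."""
    pairs = [(a, f) for a, f in zip(actuals, forecasts) if a != 0]
    return 100.0 * sum(abs(a - f) / a for a, f in pairs) / len(pairs)

def bias(actuals, forecasts):
    """Positive bias means the forecast over-shoots demand on average."""
    return sum(f - a for a, f in zip(actuals, forecasts)) / len(actuals)

if __name__ == "__main__":
    # Illustrative monthly demand vs. forecast (units); values are invented.
    actuals   = [120, 135, 150, 110, 160, 140]
    forecasts = [130, 128, 155, 120, 150, 145]
    print(f"MAPE: {mape(actuals, forecasts):.1f}%")
    print(f"Bias: {bias(actuals, forecasts):+.1f} units/month")

Comparing these figures for a baseline period against the post-go-live period is one simple way to quantify the improvement the posting refers to.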
Posted 1 month ago
1.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Designation: "Simulation Engineer" - CFD
We are seeking a highly skilled and experienced Simulation Design Engineer with a minimum of 1-2 years of expertise in CFD and Combustion. The ideal candidate will have a deep understanding of gas dynamics and shock interactions and aerodynamics, and possess excellent problem-solving skills to deliver high-quality simulations for various projects. Function as a solution specialist in Computational Fluid Dynamics using simulation tools such as ANSYS Fluent, Chemkin, Star-CCM+, and ANSYS CFX.
Mandatory Skillset/Experience/Knowledge
Knowledge of Combustion Modeling – mandatory. Strong fundamentals in CFD; Ansys Chemkin knowledge – preferred. Chemical (preferred), Aerospace, or Mechanical background with strong knowledge of combustion physics. Knowledge of gas dynamics and shock interactions – mandatory. Knowledge of external aerodynamics – mandatory. Basic knowledge of MATLAB for post-processing.
The Below Skills/Knowledge Would Be Preferred
Fluent dynamic and overset meshing. Fluent 6DOF UDFs. Fluent external aero best practices. Pressure-based and density-based solvers' applicability along with pros and cons. Low subsonic to hypersonic flows; most of the work is in the Mach 0.9 to 5 range. Adaptive meshing. Job submission on a cluster (PBS scripts) and RSM. TUI journal utilization for automation of work. Parameterization through SpaceClaim and Fluent for simultaneous design point runs.
Requirements
1-2 years of relevant experience. Chemical/Aerospace Engineer having a Ph.D. in Combustion modelling. Experience with ANSYS Fluent/Star-CCM+ or other commercial CFD software. Sound background in Combustion, Computational Fluid Dynamics (CFD), Aerodynamics and Aerothermodynamics, Scramjet Engine CFD Modeling, Propulsion, and Liquid Propellant Rocket Engine Physics. Engaging personality, engineering curiosity and willingness for continuous learning. Possesses a sense of urgency, strong organizational and follow-up skills.
Benefits
Challenging job within a young and dynamic team. Performance-driven career progression opportunities. Attractive remuneration package: on par with industry standards. Opportunity to join an organization experiencing year-on-year growth.
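The CFD posting above mentions TUI journal automation and parameterized design-point runs. As a loose sketch only, a script could stamp out one journal file per design point from a template; the TUI lines below are placeholders and should be checked against the actual Fluent TUI syntax for your version:

# Sketch: generate one journal file per design point for a parameter sweep.
# The TUI commands in the template are placeholders, not verified Fluent syntax.
JOURNAL_TEMPLATE = """\
; auto-generated journal for Mach {mach}
/file/read-case base_case.cas.h5
; ... set the far-field Mach number to {mach} here (command omitted) ...
/solve/iterate {iterations}
/file/write-data result_mach_{mach}.dat.h5
exit
"""

design_points = [0.9, 1.5, 3.0, 5.0]   # Mach numbers for the sweep (example values)

for mach in design_points:
    journal = JOURNAL_TEMPLATE.format(mach=mach, iterations=500)
    with open(f"run_mach_{mach}.jou", "w") as f:
        f.write(journal)
    print(f"wrote run_mach_{mach}.jou")

Each generated journal could then be submitted to the cluster scheduler (e.g. one PBS job per design point), which is the kind of workflow the posting's "simultaneous design point runs" implies.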
Posted 1 month ago
5.0 - 8.0 years
7 - 10 Lacs
Pune
Work from Office
Looking for a challenging role? If you really want to make a difference - make it with us. Can we energize society and fight climate change at the same time? At Siemens Energy, we can. Our technology is key, but our people make the difference. Brilliant minds innovate. They connect, create, and keep us on track towards changing the world’s energy systems. Their spirit fuels our mission.
Substation Automation Engineer
We are seeking a skilled Substation Automation Engineer with expertise in designing and implementing automation systems for electrical substations. The ideal candidate will be responsible for developing and integrating advanced technologies to enhance the monitoring, control, and protection of substation equipment. The candidate should possess a strong understanding of software systems, implementation, and maintenance of advanced automation systems for electrical substations. The successful candidate will be responsible for leveraging their knowledge and experience in substation automation design to enhance the operational efficiency and reliability of substations. This role requires a strong understanding of various technologies and software systems, as well as the ability to collaborate effectively with cross-functional teams and stakeholders.
Your new role – challenging and future-oriented
The design, engineering and implementation of substation automation systems to optimize monitoring, control, and protection functions within electrical substations. Responsible for preparation and approval of engineering documents like FDS, System Architecture, Signal List, Bill of Materials, General Arrangements (internal and external), and panel heat load calculations. Preparation of testing and commissioning documents such as FAT, SAT, test reports, and schedule plans. Knowledge of PC networking, switches, routers, GPS, firewalls, and cyber security. Substation automation process level, bay level, and control level. OS and tools: Windows, PC applications, PC communication, serial/IP networking. Debugging tools: Modscan, IEC Tester, IED Scout, Wireshark. Integrate intelligent electronic devices (IEDs), RTUs (Remote Terminal Units), gateways, and human-machine interface (HMI) systems to enable remote monitoring and control capabilities. Develop and configure communication networks and protocols, such as IEC 61850, IEC 60870-5-101/103/104, DNP3, OPC and Modbus TCP/IP, to facilitate seamless data exchange and interoperability (see the sketch after this posting). Knowledge of process bus technology. Experience in IO mapping with respect to scheme drawings and troubleshooting of database issues in SAS-based/RTU-based systems (know-how). Develop and maintain standardized parameterization procedures for substation automation equipment to optimize performance and interoperability. Collaborate with engineering teams to ensure accurate and efficient parameterization of substation automation systems. Stay updated with industry standards and technological advancements in substation automation design and implementation. Work experience on Siemens SICAM (PAS, Toolbox, WinCC, SCC) solutions, Hitachi MicroSCADA, and GE DS Agile. Planning and execution of training courses and customer presentations. Willingness to travel, as the role demands a minimum of 50-70% travel overseas.
We don’t need superheroes, just super minds. Bachelor’s degree in electrical engineering or a related field. 5 to 8 years of experience in substation automation design and implementation. Proficiency in software systems such as SAS, RTU programming, HMI development, and networking protocols. Strong knowledge of substation automation principles and communication protocols (e.g., IEC 61850, IEC 60870-5-101/104, DNP3, Modbus TCP/IP). In-depth understanding of substation parameterization and configuration for IEDs and control systems. Experience in developing and implementing standardized parameterization procedures for substation automation equipment.
Soft Skills
Excellent communication (both written and verbal) and interpersonal skills. Critical thinking, reasoning and problem solving are an essential part of this position. Enjoys learning new things and building a knowledge base in new areas. Enjoys working in team environments and exhibits collaboration in multi-stakeholder interactions. Adaptive and eager to learn and work on new technologies. Innovative, self-driven, and disciplined. Strong analytical ability and problem-solving skills. Ability to manage multiple projects and prioritize tasks effectively.
We’ve got quite a lot to offer. How about you? This role is based at the Pune site. You’ll also get to visit other locations in India and beyond, so you’ll need to go where this journey takes you. In return, you’ll get the chance to work with teams impacting entire cities, countries – and the shape of things to come. We’re Siemens. A collection of over 379,000 minds building the future, one day at a time in over 200 countries. We’re dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and imagination and help us shape tomorrow.
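Modbus TCP/IP appears in the protocol list above. Purely as an illustration (the host, register map and the pymodbus 3.x call shapes are assumptions, not a Siemens or SICAM reference), polling a few holding registers could look like this:

from pymodbus.client import ModbusTcpClient  # assumes pymodbus 3.x

# Host, port and register addresses are placeholders for a generic Modbus TCP device.
client = ModbusTcpClient("192.168.1.50", port=502)

if client.connect():
    # Read two holding registers starting at address 0 (the mapping is device-specific).
    result = client.read_holding_registers(address=0, count=2)
    if not result.isError():
        print("register values:", result.registers)
    else:
        print("read failed:", result)
    client.close()
else:
    print("could not connect to device")

Commissioning work would of course rely on the vendor's engineering tools (SICAM, IEC 61850 clients, etc.); this only shows the shape of a basic protocol-level check.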
Posted 1 month ago
6.0 years
0 Lacs
Hyderābād
On-site
Job Information Date Opened 06/16/2025 Job Type Full time Industry IT Services City BENGALURU,HYDERABAD State/Province BENGALURU Country India Zip/Postal Code BENGALURU Job Description As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on-premises and Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions, migrating an application to co-exist in the hybrid cloud (On-Premises and Cloud). Our teams have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. Work You’ll Do As an ODI developer you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure. Another type of project might involve building ETL solution on both on-premises and Oracle Cloud. The key responsibilities may involve some or all the areas listed below: Engage with clients to Conduct workshops, understand business requirements and identify business problems to solve with integrations. Lead and build Proof-of-concept to showcase value of ODI vs other platforms socialize solution design and enable knowledge transfer drive train-the trainer sessions to drive adoption of ODI partner with clients to drive outcome and deliver value Collaborate with cross functional teams to understand source applications and how it can be integrated analyze data sets to understand functional and business context create Data Warehousing data model and integration design understand cross functional process such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR), Project to Complete (PTC) communicate development status and risks to key stakeholders Lead the team to design, build, test and deploy Support client needs by delivering ODI jobs and frameworks Merge, Customize and Deploy ODI data model as per client business requirements Deliver large/medium DWH programs, demonstrate expert core consulting skills and advanced level of ODI, SQL, PL/SQL knowledge and industry expertise to support delivery to clients Focus on designing, building, and documenting re-usable code artifacts Track, report and optimize ODI jobs performance to meet client SLA Designing and architecting ODI projects including upgrade/migrations to cloud Design and implement security in ODI Identify risks and suggest mitigation plan Ability to lead the team and mentor junior practitioners Produce high-quality code resulting from knowledge of the tool, code peer review, and automated unit test scripts Perform system analysis, follow technical design and work on development activities Participate in design meetings, daily standups, backlog grooming Lead respective tracks in Scrum team meetings, including all Agile and Scrum related activities Reviews and evaluates designs and project activities for compliance with systems design and development guidelines and standards; provides tangible feedback to improve product quality and mitigate failure risk. 
Develop environment strategy, Build the environment & execute migration plans Validate the environment to meets all security and compliance controls Lead the testing efforts during SIT and UAT by coordinating with functional teams and all stakeholders Contribute to sales pursuits by helping the pursuit team to understand the client request and propose robust solutions Ideally, you should also have Expertise in database development (SQL/ PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle object such as Tables, Views, Indexes, Partitions, Stored Procedures & Functions in PL/SQL, Packages, Materialized Views and Analytical functions Working knowledge of GIT or similar source code control system Experience of creating PL/SQL packages, procedures, Functions, Triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in oracle Experience in SQL tuning and optimization using explain plan and SQL trace files Partitioning and Indexing strategy for optimal performance Good verbal and written communication in English, Strong interpersonal, analytical and problem-solving abilities. Experience of interacting with customers in understanding business requirement documents and translating them into ETL specifications and High- and Low-level design documents. Analytics & Cognitive Our Analytics & Cognitive team focuses on enabling our client’s end-to-end journey from On-Premise to Cloud, with opportunities in the areas of: Cloud Strategy, Op Model Transformation, Cloud Development, Cloud Integration & APIs, Cloud Migration, Cloud Infrastructure & Engineering, and Cloud Managed Services. We help our clients see the transformational capabilities of Cloud as an opportunity for business enablement and competitive advantage. Analytics & Cognitive team supports our clients as they improve agility and resilience, and identifies opportunities to reduce IT operations spend through automation by enabling Cloud. We accelerate our clients towards a technology-driven future, leveraging vendor solutions and Deloitte-developed software products, tools, and accelerators. Technical Requirements Education: B.E./B.Tech/M.C.A./M.Sc (CS) 6+ years ETL Lead / developer experience and a minimum of 3-4 Years’ experience in Oracle Data Integrator (ODI) Expertise in the Oracle ODI toolset and Oracle PL/SQL, ODI Minimum 2-3 end to end DWH Implementation experience. Should have experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc. Should be able to implement reusability, parameterization, workflow design, etc. knowledge of ODI Master and work repository Knowledge of data modelling and ETL design Design and develop complex mappings, Process Flows and ETL scripts Must be well versed and hands-on in using and customizing Knowledge Modules (KM) Setting up topology, building objects in Designer, Monitoring Operator, different type of KM’s, Agents etc. Packaging components, database operations like Aggregate pivot, union etc. 
Using ODI mappings, error handling, automation using ODI, Load plans, Migration of Objects Design and develop complex mappings, Process Flows and ETL scripts Must be well versed and hands-on in using and customizing Knowledge Modules (KM) Experience of performance tuning of mappings Ability to design ETL unit test cases and debug ETL Mappings Expertise in developing Load Plans, Scheduling Jobs Integrate ODI with multiple Source / Target Experience in Data Migration using SQL loader, import/export Consulting Requirements 6-10 years of relevant consulting, industry or technology experience Proven experience assessing client’s workloads and technology landscape for Cloud suitability Experience in defining new architectures and ability to drive project from architecture standpoint Ability to quickly establish credibility and trustworthiness with key stakeholders in client organization. Strong problem solving and troubleshooting skills Strong communicator Willingness to travel in case of project requirement Preferred Experience in Oracle BI Apps Exposure to one or more of the following: Python, R or UNIX shell scripting. Expertise in database development (SQL/ PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle object such as Tables, Views, Indexes, Partitions, Stored Procedures & Functions in PL/SQL, Packages, Materialized Views and Analytical functions Working knowledge of GIT or similar source code control system Experience of creating PL/SQL packages, procedures, Functions, Triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in oracle Experience in SQL tuning and optimization using explain plan and SQL trace files Partitioning and Indexing strategy for optimal performance Good verbal and written communication in English, Strong interpersonal, analytical and problem-solving abilities. Experience of interacting with customers in understanding business requirement documents and translating them into ETL specifications and High- and Low-level design documents Systematic problem-solving approach, coupled with strong communication skills Ability to debug and optimize code and automate routine tasks. Experience writing scripts in one or more languages such as Python, UNIX Scripting and/or similar. Experience working with technical customers
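SQL tuning with explain plan is listed in the preferred skills above. One hedged way to capture a plan programmatically (the connection details and query are placeholders) combines EXPLAIN PLAN with DBMS_XPLAN through the python-oracledb driver:

import oracledb  # python-oracledb; credentials and table names below are placeholders

conn = oracledb.connect(user="ods_user", password="***", dsn="dbhost/orclpdb1")
cur = conn.cursor()

# 1) Ask the optimizer to populate PLAN_TABLE for a candidate statement.
cur.execute("""
    EXPLAIN PLAN FOR
    SELECT /* tuning candidate */ c.customer_id, SUM(o.amount)
    FROM   orders o JOIN customers c ON c.customer_id = o.customer_id
    GROUP  BY c.customer_id
""")

# 2) Pretty-print the plan that was just produced.
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur:
    print(line)

cur.close()
conn.close()

Reading the reported access paths and join methods is the starting point for the partitioning and indexing decisions the posting mentions.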
Posted 1 month ago
5.0 years
0 Lacs
Greater Bengaluru Area
Remote
Company Description
Festo is a global family-owned company headquartered in Germany. For many years Festo has been providing innovations for factory automation and offers a wide product and service portfolio – from individual components to complex customized solutions and systems. As a family-owned company, we take responsibility for our actions globally and locally. We actively contribute to the quality of life and conservation of resources by focusing on cutting-edge technologies and knowledge as well as life-long learning. We are present in over 176 countries and collaborate in a network of over 15 development locations worldwide.
Role Description
We are looking for an experienced Senior Technical Product Support Engineer to join our global technical support team. In this role, you will leverage your extensive expertise in industrial engineering technologies, including electrical circuits, control panels, remote I/Os, fieldbus, PLCs, mechanical systems, and pneumatic technologies. Your primary responsibilities will include providing advanced technical support to customers, resolving complex issues related to Festo Electric Automation products, and delivering product training sessions.
Responsibilities
Provide remote technical support to customers for troubleshooting and commissioning electric and pneumatic automation systems. Identify and evaluate problems using Festo electromechanical components, electro-pneumatic systems, and PLCs. Resolve technical issues related to product malfunction, incorrect installation, and wrong parameterization. Select appropriate components for Electric Automation and Pneumatic applications. Maintain communication with other Festo companies worldwide to clarify customer-specific problems and technical details. Actively participate in technical trainings, documentation, and knowledge sharing.
Qualifications
Degree in engineering in the field of mechatronics or equivalent. Strong skills and experience in PLC programming (Festo CODESYS, Siemens TIA Portal, Rockwell ControlLogix, Beckhoff TwinCAT). Good knowledge of industrial Ethernet fieldbus protocols (Profinet, EtherCAT, EtherNet/IP). Knowledge of electrical drive systems, remote IO systems, and pneumatics. Excellent written and verbal English communication. Proficiency in MS Office. Independent and responsible work ethic. Technical inclination towards new product launches and applications. Ability to work effectively in an international team. Language skills in German, Spanish, or Chinese are an advantage. Basic knowledge of modelling the dynamic behavior of mechanical systems. Very good English skills. Ability to work individually and in an international team.
What we offer
Challenging work on cutting-edge software technologies with a clear product focus. Collaborate with our agile Indo-German team. Dynamic work environment with numerous personal development opportunities. Access to on-the-job and off-the-job learning opportunities. Flexible, hybrid working arrangements.
Job location: Bengaluru - Bommasandra, India. Job type: Full-time. Job level: Senior. Experience: 5 years.
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Why join us? Our purpose is to design for the good of humankind. It’s the ideal we strive toward each day in everything we do. Being a part of MillerKnoll means being a part of something larger than your work team, or even your brand. We are redefining modern for the 21st century. And our success allows MillerKnoll to support causes that align with our values, so we can build a more sustainable, equitable, and beautiful future for everyone.
Role: Associate Software Engineer
Location: Bangalore
Job Description
We are looking to hire an Associate Software Engineer for the Data Engineering team, with excellent technical and communication skills, to effectively design, develop, and validate data pipelines using cloud technologies while collaborating with IT and business stakeholders to understand their needs and develop functionality and enhancements. As part of day-to-day operations, the Associate Software Engineer in the Data Engineering team will monitor production data pipelines to ensure timely and successful execution. This role transcends organizational and geographical boundaries as it aims at supporting and enabling the various divisions of the Herman Miller business across the globe. The ideal candidate should understand the software development lifecycle and use agile methodology to design, develop, test, and implement solutions that deliver on end-user needs.
Responsibilities
▪ Translate stakeholder deliverables into the project methodology deliverables that result in design, such as functional specifications, use cases, workflow & process diagrams, data flow & data model diagrams. ▪ ETL design, development and support of data pipelines used for Analytics. ▪ Monitoring of the ETL jobs over the weekend and providing support in case of any issues. ▪ Perform table and interface design, development of logical and physical data models/databases and ETL processes while meeting business/user requirements. ▪ Design, develop, deploy, and validate extremely reliable, scalable, and high-performing data pipelines. ▪ Ensure quality of the data and code using quality tools and frequent code reviews. ▪ Assist in the development of logical and physical data models for designing/developing Business Intelligence & Data Warehouse requirements. ▪ Ensure proper source code management of data pipelines, stored procedures, and database definitions. ▪ Provide accurate documentation of data pipelines. ▪ Distribute tasks amongst team members and ensure timely/accurate completion as defined by the business. ▪ Coordinate prioritized work and help to mitigate any blockers for the team. ▪ Efficient communication of known issues/problems with team members as well as appropriate escalation to leadership.
Essential experience
▪ Demonstrated experience using graphical ELT/ETL pipeline development tools. ▪ Experience with data analysis, data modeling and data warehousing is a must. ▪ Ability to troubleshoot technical and functional problems with intuitive problem-solving techniques with little supervision or direction. ▪ Experience in optimizing databases, understanding of referential integrity and the use of indexes. ▪ Possess an extremely sound understanding of columnar databases. ▪ Strong SQL knowledge and experience writing queries and DML. ▪ Moderate knowledge of cloud-based tools and how to interact with them.
Ideal candidate
A graduate / post-graduate in computer science / technology / engineering or mathematics. Having excellent interpersonal and communication skills in English, both written and verbal.
At least 3+ years of hands-on Data Engineering experience using ELT/ETL tools to meet business requirements for analytics/reporting. Strong experience with parameterization and process optimization. Experience using Matillion, AWS, Python and Snowflake to develop robust data pipelines. Ability to work individually or within a team environment. Ability to manage multiple projects, tasks, and priorities in a healthy work environment. Has attention to detail and a penchant for quality. Proficient in documentation as well as process and workflow design. Command of version control methodologies, preferably with Git/GitHub. Ability to take direction and constructive criticism, and work to specified deadlines. Adhere to the processes and procedures defined for the role, the team, and the organization. This role requires the individual to work in the extended UK shift (2:00 P.M. – 11:00 P.M. India time), and during critical issues, releases, updates, migrations or other urgent business needs, there could be a requirement to work during US business hours (5:30 P.M. to 2:30 A.M.).
Who We Hire?
Simply put, we hire qualified applicants representing a wide range of backgrounds and abilities. MillerKnoll is comprised of people of all abilities, gender identities and expressions, ages, ethnicities, sexual orientations, veterans from every branch of military service, and more. Here, you can bring your whole self to work. We’re committed to equal opportunity employment, including veterans and people with disabilities. MillerKnoll complies with applicable disability laws and makes reasonable accommodations for applicants and employees with disabilities. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact MillerKnoll Talent Acquisition at careers_help@millerknoll.com.
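Parameterization of pipelines with Python and Snowflake is called out above. A minimal sketch with the Snowflake Python connector, where the account, tables and bind style reflect a typical setup rather than MillerKnoll's actual pipelines:

import snowflake.connector  # snowflake-connector-python; all connection details are placeholders

conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

def load_partition(run_date: str, region: str) -> int:
    """Run one parameterized load step; binds avoid pasting values into the SQL text."""
    sql = """
        INSERT INTO sales_curated
        SELECT * FROM sales_raw
        WHERE  load_date = %(run_date)s AND region = %(region)s
    """
    cur = conn.cursor()
    try:
        # pyformat-style binds are assumed here; the table names are invented.
        cur.execute(sql, {"run_date": run_date, "region": region})
        return cur.rowcount
    finally:
        cur.close()

if __name__ == "__main__":
    print("rows loaded:", load_partition("2024-06-01", "EMEA"))
    conn.close()

In a Matillion-style orchestration the same run_date and region values would typically arrive as job variables rather than function arguments.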
Posted 1 month ago
0 years
0 Lacs
Gajuwaka, Andhra Pradesh, India
On-site
Machine Building Activity: PLC, SCADA, HMI, and custom screen developments. PLC, HMI, and VFD parameterization, commissioning and prove-outs.
Posted 1 month ago
5.0 years
0 Lacs
Jakarta, Indonesia
On-site
Company Description
Quantyc.ai is a Software and IT Consultancy provider with offices in Mumbai and Jakarta, specializing in Analytics, Predictive, and Data-backed decision support solutions across multiple industries. The company implements world-leading products in Private Wealth, Lending, and Digital Banking with an emphasis on global best practices and state-of-the-art technology.
Role Description
Design and develop JMeter test scripts for load, stress, and endurance testing. Simulating real-world user behavior for Wealth Management applications (e.g., portfolio dashboards, transaction systems) is a plus. Integrate JMeter with CI/CD pipelines (e.g., Jenkins, GitLab). Analyze test results and identify performance bottlenecks. Collaborate with developers, DevOps, and infrastructure teams to optimize system performance. Generate detailed performance reports and dashboards.
Required Skills
Expertise in Apache JMeter: scripting, parameterization, correlation, assertions, and listeners. Experience with distributed load testing execution. Experience in Banking is a must. Familiarity with Wealth Management platforms and financial transaction flows. Strong knowledge of HTTP/S, REST APIs, JSON/XML, and SQL. Experience with monitoring tools (e.g., Dynatrace, Elastic).
Qualifications
Bachelor’s degree in Computer Science, Engineering, or a related field. 2–5+ years of experience in performance testing, with at least 2 years using JMeter. ISTQB or performance testing certifications are a plus.
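CI/CD integration and parameterization are core to the JMeter role above. As a loose sketch, a pipeline step could launch JMeter in non-GUI mode and inject properties that the test plan reads with the __P() function; the paths and property names here are examples only:

import subprocess

# Property values a CI job might inject; the .jmx plan would read them
# with functions like ${__P(users,10)} and ${__P(ramp_seconds,60)}.
params = {"users": "50", "ramp_seconds": "120", "base_url": "https://test.example.com"}

# -n = non-GUI, -t = test plan, -l = results file, -J = define a JMeter property.
cmd = ["jmeter", "-n", "-t", "wealth_portfolio_load.jmx", "-l", "results.jtl"]
cmd += [f"-J{key}={value}" for key, value in params.items()]

print("running:", " ".join(cmd))
completed = subprocess.run(cmd, check=False)
print("jmeter exit code:", completed.returncode)

The same command line drops straight into a Jenkins or GitLab CI job, with the resulting .jtl file archived or fed to a reporting step.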
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Altera, a member of the N. Harris Computer Corporation family, delivers health IT solutions that support caregivers around the world. These include the Sunrise™, Paragon®, Altera TouchWorks®, Altera Opal, STAR™, HealthQuest™ and dbMotion™ solutions. At the intersection of technology and the human experience, Altera Digital Health is driving a new era of healthcare, in which innovation and expertise can elevate care delivery and inspire healthier communities across the globe. A new age in healthcare technology has just begun.
Responsibilities
Data Conversion and Interface: Demonstrate proficiency with interface or data conversion development and implementation tools used by the team. Negotiate interface or data conversion specifications and plans for implementation with customers/vendors for those that deviate from standard. Install software and interface definitions on the customer's servers. Create test plan templates and test with the customer/vendor to ensure that functionality meets the needs of the customer. Provide interface and related workflow solutions within the guidelines of the interface team's custom request procedures. Development, deployment, and testing of configuration release packages based on customer requirements. Assist the customer with the parameterization and configuration necessary to deliver the functionality defined in the contract scope. Provide technical assistance to other departments in relation to interface development and installation. Provide 3rd-level interface support.
Technical Consultant: Demonstrate proficiency in the delivery of hardware and system software technology to customers using standard procedures. Provide technical support during the implementation delivery process, including triaging and troubleshooting issues reported by the customer and implementation team. Advise clients on testing strategy and planning to ensure successful solution implementation. Install all software required for the Altera solutions. Carry out the implementation of Altera solutions and designated 3rd-party products following a detailed project plan and in accordance with the contracted project scope. Develop technical documentation to facilitate efficiency, document customer implementations, and support department education objectives. Train customers and Altera staff on your areas of expertise.
Qualifications
Academic and professional qualifications: Bachelor's Degree in a relevant technical field (Information Technology, Computer Science, Software, Engineering, or a related technical discipline), or an equivalent combination of training and work experience.
Experience: 2-4 or more years in related responsibilities depending on education level.
Technical Qualifications: Must be willing to work from the office. Must have excellent communication skills. Must be willing to work in shifts. Excellent technical capabilities - ability to understand, configure and troubleshoot complex systems. Creative thinker with a problem-solving approach. Strong analytical abilities. Data modeling. Ability to write SQL queries and understand complex database structures. Knowledge of code/scripting languages. Healthcare standards - deep understanding of HL7, CCDAs and/or XML and how to interpret and read both formats (see the sketch after this posting). Familiarity with clinical terminology standards such as SNOMED, LOINC, ICD, etc. Altera is an Equal Opportunity/Affirmative Action Employer.
We consider applicants without regard to race, color, religion, age, national origin, ancestry, ethnicity, gender, gender identity, gender expression, sexual orientation, marital status, veteran status, disability, genetic information, citizenship status, or membership in any other group protected by local law. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at: HR.Recruiting@AlteraHealth.com
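HL7 familiarity is part of the Altera profile above. A tiny illustration of how an HL7 v2 message splits into segments and pipe-delimited fields (the sample message is invented, and a real interface would use a proper HL7 library rather than this hand parsing):

# Minimal HL7 v2 field splitting -- illustration only, not a full parser.
SAMPLE_HL7 = (
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|"
    "202406150830||ADT^A01|MSG00001|P|2.5\r"
    "PID|1||123456^^^HOSP^MR||DOE^JOHN||19800101|M\r"
)

def parse_segments(message: str) -> dict:
    """Return {segment_name: [fields]} for each segment in the message."""
    segments = {}
    for raw in filter(None, message.split("\r")):
        fields = raw.split("|")
        segments[fields[0]] = fields
    return segments

if __name__ == "__main__":
    msg = parse_segments(SAMPLE_HL7)
    # PID-5 carries the patient name; components are separated by '^'.
    family, given = msg["PID"][5].split("^")[:2]
    print("message type:", msg["MSH"][8])   # ADT^A01 (MSH-1 is the '|' itself, hence the offset)
    print("patient:", given, family)

CCDA documents, by contrast, are XML and would typically be handled with an XML parser plus the relevant templates rather than string splitting.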
Posted 1 month ago
4.0 years
0 Lacs
Itanagar, Arunachal Pradesh, India
Remote
We're Hiring (Immediate Joiners Only): QA Automation Engineer (Remote, India). Position Type: Full-time. Experience Level: 4+ Years. Location: Remote (India). Notice Period: Immediate Joiner (can join within a week).
About The Role
We are in search of a talented and motivated QA Automation Engineer who can take ownership of testing initiatives across web and API layers. The ideal candidate will have 4+ years of experience in frontend and backend test automation, with a passion for creating scalable testing frameworks from scratch and ensuring the highest product quality. You should be strong in logic building and test strategy, and be able to thrive in a remote, fast-paced environment.
Required Skills & Tools
Strong hands-on experience with the Playwright automation tool using Python or Java. Working experience with Selenium for browser-based automation. Proficient in Postman for manual and automated RESTful API testing. In-depth knowledge and implementation of the Page Object Model (POM) design pattern. Solid logic building and problem-solving skills to handle complex test scenarios and framework architecture. Strong knowledge of test scripting, assertion logic, data-driven testing, and parameterization. Familiarity with version control systems like Git and CI/CD tools such as Jenkins/GitHub Actions. Experience with TypeScript. Design and build scalable, reusable, and maintainable automation frameworks for frontend and API testing. Write clean and efficient test automation scripts using Playwright with Python or Java. Develop, maintain, and execute detailed test cases and test scripts. Perform functional, regression, smoke, and integration testing on web and APIs. Analyze test results, debug issues, and collaborate with developers for resolution. Actively participate in sprint planning, estimations, and daily stand-ups in an Agile/Scrum environment. Ensure test coverage and traceability, and contribute to continuous test improvement.
You'll Thrive If You
Take ownership of your work and deliverables. Love building high-quality solutions with an emphasis on customer experience. Have a strong attention to detail and a passion for automation. Communicate clearly and collaborate well in a remote team. Enjoy problem-solving and are quick to adapt to new tools and challenges.
Nice To Have
Experience with additional frameworks or tools like Cypress, Robot Framework, or TestNG. Exposure to BDD frameworks like Cucumber. Knowledge of performance testing tools (e.g., JMeter). Experience with containerized environments (Docker, Kubernetes). Familiarity with cloud-based testing platforms (e.g., BrowserStack, Sauce Labs). (ref:hirist.tech)
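The Page Object Model requirement above lends itself to a short sketch. The URL and selectors are invented; this only shows the POM shape with Playwright's sync Python API:

from playwright.sync_api import sync_playwright, Page

class LoginPage:
    """Page object: owns the selectors and actions for a hypothetical login screen."""

    URL = "https://app.example.com/login"   # placeholder URL

    def __init__(self, page: Page):
        self.page = page

    def open(self):
        self.page.goto(self.URL)

    def login(self, username: str, password: str):
        self.page.fill("#username", username)
        self.page.fill("#password", password)
        self.page.click("button[type='submit']")

    def error_text(self) -> str:
        return self.page.inner_text(".login-error")

def test_invalid_login_shows_error():
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        login = LoginPage(page)
        login.open()
        login.login("qa_user", "wrong-password")
        assert "invalid" in login.error_text().lower()
        browser.close()

Keeping selectors inside the page object is what makes the test body readable and keeps locator changes confined to one class.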
Posted 1 month ago
0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Job purpose: Core Banking Data Migration Consultants with expertise and prior work experience in the FIS Core Banking product (MBP/Profile 7) and its modules.
Your client responsibilities: Need to work as a team lead in one or more Core Banking Implementation projects. Excellent business communication skills. Should be able to drive and lead the DRG. Should be ready to travel to customers’ locations on a need basis (after the pandemic). Should exhibit deep experience in Banking during client discussions and be able to convince the client of the solution.
Your People Responsibilities: Building a quality culture. Manage performance management for direct reportees, as per the organization’s policies. Training and mentoring of project resources. Participating in the organization-wide people initiatives.
Mandatory skills: Hands-on experience with the FIS Conversion tool or similar ETL tools used for data migration. Hands-on experience in the data model, table structures and mapping exercises for FIS transformation projects. Should be able to write PL/SQL queries in Oracle and MySQL databases. Well versed in the functional aspects of the FIS Core Banking Solution and able to play the Data Migration Analyst role, with experience in mapping fields between the old and new systems. Exposure to conducting or participating in product demonstrations, training and assessment studies. Good understanding of the platform architecture, administration, configuration and data structures. Should be experienced in data migration strategy document creation, planning, testing and end-to-end execution. Should be able to provide Out-of-the-Box (OOTB) solutions and be able to provide a customization approach as well. Should be able to handle parameterization of Finacle core for the respective modules individually. Exposure to cloud environments is preferred. Should be ready to travel to customers’ locations on a need basis (after the pandemic). Should be able to review the test cases and guide the testing team on a need basis. Should have experience of 2 or more end-to-end FIS core banking implementations.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
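Field mapping between the legacy and target core banking systems is central to the EY role above. A toy, tool-agnostic sketch of a mapping-driven transform, where the field names and rules are invented rather than actual FIS MBP/Profile structures:

# Mapping of legacy field -> (target field, transform); everything here is invented.
FIELD_MAP = {
    "CUST_NO":   ("customer_id",  str.strip),
    "ACCT_TYPE": ("product_code", lambda v: {"SB": "SAVINGS", "CA": "CURRENT"}.get(v, "UNKNOWN")),
    "BAL":       ("balance",      lambda v: round(float(v), 2)),
    "OPEN_DT":   ("open_date",    lambda v: f"{v[0:4]}-{v[4:6]}-{v[6:8]}"),  # YYYYMMDD -> ISO
}

def transform_record(legacy: dict) -> dict:
    """Apply the mapping to one legacy record and return the target-shaped record."""
    target = {}
    for src, (dst, fn) in FIELD_MAP.items():
        target[dst] = fn(legacy[src]) if legacy.get(src) is not None else None
    return target

if __name__ == "__main__":
    legacy_row = {"CUST_NO": " 000123 ", "ACCT_TYPE": "SB", "BAL": "1523.456", "OPEN_DT": "20190412"}
    print(transform_record(legacy_row))

In a real migration this mapping sheet would be the signed-off analyst deliverable, and the transform would run inside the conversion tool or a PL/SQL routine rather than ad-hoc Python.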
Posted 1 month ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Job purpose: Core Banking Data Migration Consultants with expertise and prior work experience in the FIS Core Banking product (MBP / Profile 7) and its modules.
Your client responsibilities:
Work as a team lead on one or more Core Banking implementation projects
Excellent business communication skills
Able to drive and lead the DRG
Ready to travel to customers' locations on a need basis (after the pandemic)
Demonstrate deep banking experience during client discussions and be able to convince the client of the solution
Your people responsibilities:
Build a quality culture
Manage performance management for direct reportees, as per organization policies
Train and mentor project resources
Participate in organization-wide people initiatives
Mandatory skills:
Hands-on experience with the FIS Conversion tool or similar ETL tools used for data migration
Hands-on experience with the data model, table structures, and mapping exercise for an FIS transformation project
Able to write PL/SQL queries in Oracle and MySQL databases
Well-versed in the functional aspects of the FIS Core Banking Solution and able to play the Data Migration Analyst role, with experience mapping fields between the old and new systems
Exposure to conducting or participating in product demonstrations, training, and assessment studies
Good understanding of the platform architecture, administration, configuration, and data structures
Experienced in data migration strategy document creation, planning, testing, and end-to-end execution
Able to provide Out-of-the-Box (OOTB) solutions as well as a customization approach
Able to handle parameterization of Finacle core for the respective modules individually
Exposure to cloud environments is preferred
Ready to travel to customers' locations on a need basis (after the pandemic)
Able to review test cases and guide the testing team on a need basis
Two or more end-to-end FIS core banking implementations
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago