2.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
You will be working as an Informatica BDM professional at PibyThree Consulting Pvt Ltd. in Pune, Maharashtra. PibyThree is a global cloud consulting and services provider focusing on Cloud Transformation, Cloud FinOps, IT Automation, Application Modernization, and Data & Analytics. The company's goal is to help businesses succeed by leveraging technology for automation and increased productivity. Your responsibilities will include:
- At least 4 years of development and design experience in Informatica Big Data Management (BDM)
- Excellent SQL skills
- Hands-on work with HDFS, HiveQL, Informatica BDM, Spark, HBase, Impala, and other big data technologies
- Designing and developing BDM mappings in Hive mode for large volumes of INSERT/UPDATE operations
- Creating complex ETL mappings using transformations such as Source Qualifier, Sorter, Aggregator, Expression, Joiner, Dynamic Lookup, Lookup, Filter, Sequence, Router, and Update Strategy
- Debugging Informatica and working with tools such as Sqoop and Kafka
This is a full-time position that requires in-person work during day shifts. The preferred education qualification is a Bachelor's degree, and the preferred experience is 4 years of total work experience, with 2 years specifically in Informatica BDM.
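As a hedged illustration of the Hive-mode INSERT/UPDATE work this listing describes, here is a minimal PySpark sketch of the upsert pattern such a BDM mapping typically implements; the database, table, and key names are assumptions invented for illustration, not part of the posting.

```python
# Sketch of an upsert against a Hive table: keep target rows whose keys are
# not in the delta, then append the delta. All object names are hypothetical.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive-upsert-sketch")
         .enableHiveSupport()
         .getOrCreate())

target = spark.table("dw.customer")              # existing Hive target
updates = spark.table("staging.customer_delta")  # incoming INSERT/UPDATE batch

# Rows in the target NOT being updated, plus all incoming rows: the merge
# semantics a Hive-mode BDM mapping expresses declaratively.
unchanged = target.join(updates, "customer_id", "left_anti")
merged = unchanged.unionByName(updates)

# Write to a new table rather than overwriting the table being read.
merged.write.mode("overwrite").saveAsTable("dw.customer_merged")
```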
Posted 5 days ago
9.0 - 13.0 years
0 Lacs
Karnataka
On-site
As a Manager within SAP Consulting services at PwC, your role will involve working as part of a team of problem solvers, helping to address complex business issues from strategy to execution. With a focus on maximizing the value of clients' SAP investment, you will provide comprehensive consulting, system integration, and implementation services across multiple SAP applications, products, and technologies.

Your primary responsibilities will include strong experience in S/4HANA Brownfield conversion projects, including conducting pre-conversion readiness checks, resolving system simplification items, and managing data migration objects related to MM. You should be adept at leveraging SAP Readiness Check, Simplification Items (SI), and Fiori activation relevant to SD scenarios during conversion projects. Additionally, you will need the ability to address module-specific functional and technical requirements during the Brownfield migration journey, including functional delta handling, testing of legacy configurations in S/4, and adoption of new S/4HANA innovations.

In this role, you will be responsible for preparing and maintaining the data migration plan, overseeing risk planning, mitigation, and the issue log, and publishing status reports. You will conduct Business Blueprint data migration workshops, manage the identification, extraction, cleaning, mapping, and loading of both master and transactional data into the SAP S/4HANA system, and ensure data integrity and quality throughout the migration process. Furthermore, you will work closely with functional consultants to verify data and troubleshoot any issues that arise during the migration. It will be essential to design and manage ETL processes, including the development of load programs using tools like LSMW, LTMC, and others. You will also be required to coordinate pre-conversion activities such as custom code analysis, simplification item checks, and data cleanup, ensuring the proper handling of custom developments, interfaces, and third-party integrations.

Your experience should include at least 7 years of relevant Brownfield SAP experience, with significant involvement in SAP ECC to S/4HANA migration projects. You should have hands-on experience with S/4 Brownfield migration, the ability to tackle mandatory functional items during migration, and proven experience in leading Brownfield system conversion using SAP's recommended tools and methodologies. Knowledge of SAP ECC and S/4HANA architecture, data structures, and technical components, as well as an understanding of functional modules such as FI, CO, SD, MM, PP, and their cross-module impacts, will be crucial for success in this role.

The ideal candidate will possess a Bachelor's degree or equivalent, along with 9 to 12 years of experience in the field. Any graduation or post-graduation in a relevant discipline will be considered beneficial. Travel to client locations may be required as per project requirements, and the role is based in Bangalore, Hyderabad, Mumbai, or Kolkata.

If you are passionate about leveraging your expertise in SAP consulting to drive successful Brownfield conversion projects and enhance procurement and supply chain efficiency through automation solutions and AI tools, we invite you to apply to join our team at PwC. At PwC, we care for our people and provide a high-performance culture focused on diversity, inclusion, and continuous learning and development, making it one of the best places to work, learn, and excel in your career.
Apply now and be a part of a team that connects people with diverse backgrounds and skill sets to solve important problems and lead with purpose for our clients, our communities, and the world at large.
Posted 5 days ago
5.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have 5-12 years of experience in Big Data and data-related technologies, with expertise in distributed computing principles. Your skills should include an expert-level understanding of Apache Spark and hands-on programming with Python. Proficiency in Hadoop v2, MapReduce, HDFS, and Sqoop is required. Experience in building stream-processing systems using technologies like Apache Storm or Spark Streaming, as well as working with messaging systems such as Kafka or RabbitMQ, will be beneficial. A good understanding of Big Data querying tools like Hive and Impala, along with integration of data from multiple sources including RDBMS, ERP, and files, is necessary. You should possess knowledge of SQL queries, joins, stored procedures, and relational schemas. Experience with NoSQL databases like HBase, Cassandra, and MongoDB, along with ETL techniques and frameworks, is expected. Performance tuning of Spark jobs and familiarity with native cloud data services like AWS or Azure Databricks is essential. The role requires the ability to efficiently lead a team, design and implement Big Data solutions, and work as a practitioner of Agile methodology. This position falls under the Data Engineer category and also suits ML/AI Engineers, Data Scientists, and Software Engineers.
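As a hedged sketch of the stream-processing stack this listing names (Spark plus Kafka), the following minimal PySpark Structured Streaming job reads JSON events from a Kafka topic and lands them as Parquet; the broker address, topic, schema, and paths are illustrative assumptions.

```python
# Minimal Structured Streaming consumer; assumes the spark-sql-kafka package
# is available at runtime. Topic, schema, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
          .option("subscribe", "orders")                     # placeholder
          .load()
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "/data/orders")               # placeholder path
         .option("checkpointLocation", "/chk/orders")  # required for recovery
         .start())
query.awaitTermination()
```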
Posted 5 days ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Lead Data Engineer specializing in Snowflake Migration at Anblicks, you will be a key player in our Data Modernization Center of Excellence (COE). You will be at the forefront of transforming traditional data platforms by utilizing Snowflake, cloud-native tools, and intelligent automation to help enterprises unlock the power of the cloud. Your primary responsibility will be to lead the migration of legacy data warehouses such as Teradata, Netezza, Oracle, or SQL Server to Snowflake. You will re-engineer and modernize ETL pipelines using cloud-native tools and frameworks like DBT, Snowflake Tasks, Streams, and Snowpark. Additionally, you will design robust ELT pipelines on Snowflake that ensure high performance, scalability, and cost optimization, while integrating Snowflake with AWS, Azure, or GCP. In this role, you will also focus on implementing secure and compliant architectures with RBAC, masking policies, Unity Catalog, and SSO. Automation of repeatable tasks, ensuring data quality and parity between source and target systems, and mentoring junior engineers will be essential aspects of your responsibilities. Collaboration with client stakeholders, architects, and delivery teams to define migration strategies, as well as presenting solutions and roadmaps to technical and business leaders, will also be part of your role. To qualify for this position, you should have at least 6 years of experience in Data Engineering or Data Warehousing, with a minimum of 3 years of hands-on experience in Snowflake design and development. Strong expertise in migrating ETL pipelines from Talend and/or Informatica to cloud-native alternatives, proficiency in SQL, data modeling, ELT design, and pipeline performance tuning are prerequisites. Familiarity with tools like DBT Cloud, Airflow, Snowflake Tasks, or similar orchestrators, as well as a solid understanding of cloud data architecture, security frameworks, and data governance, are also required. Preferred qualifications include Snowflake certifications (SnowPro Core and/or SnowPro Advanced Architect), experience with custom migration tools, metadata-driven pipelines, or LLM-based code conversion, familiarity with domain-specific architectures in Retail, Healthcare, or Manufacturing, and prior experience in a COE or modernization-focused consulting environment. By joining Anblicks as a Lead Data Engineer, you will have the opportunity to lead enterprise-wide data modernization programs, tackle complex real-world challenges, and work alongside certified Snowflake architects, cloud engineers, and innovation teams. You will also have the chance to build reusable IP that scales across clients and industries, while experiencing accelerated career growth in the dynamic Data & AI landscape.
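To ground the Snowflake Streams and Tasks mention above, here is a hedged sketch of that ELT pattern driven from Python via the Snowflake connector; the account, credentials, and all table and task names are placeholders invented for illustration.

```python
# Illustrative Stream/Task ELT setup via the snowflake-connector-python
# package. All object names and credentials are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGE",
)
cur = conn.cursor()

# Capture change rows on the landing table...
cur.execute("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw_orders")

# ...and merge them into the modeled table on a schedule via a Task.
cur.execute("""
    CREATE TASK IF NOT EXISTS merge_orders
      WAREHOUSE = ETL_WH
      SCHEDULE  = '5 MINUTE'
    AS
      MERGE INTO dim_orders d
      USING orders_stream s ON d.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET d.status = s.status
      WHEN NOT MATCHED THEN INSERT (order_id, status)
                            VALUES (s.order_id, s.status)
""")
cur.execute("ALTER TASK merge_orders RESUME")  # tasks start suspended
conn.close()
```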
Posted 5 days ago
3.0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Python (Programming Language), Apache Airflow
Minimum 3 Year(s) Of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good To Have Skills: Experience with Apache Airflow, Python (Programming Language).
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and management best practices.

Additional Information:
- The candidate should have minimum 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Kolkata office.
- A 15 years full time education is required.
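Since the posting pairs Databricks with Apache Airflow, here is a hedged sketch of how the two are commonly wired together, assuming the Airflow Databricks provider package and a recent Airflow 2.x; the connection id, cluster spec, and notebook path are illustrative assumptions.

```python
# Minimal Airflow DAG triggering a Databricks notebook run. Requires the
# apache-airflow-providers-databricks package; all values are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksSubmitRunOperator,
)

with DAG(
    dag_id="daily_databricks_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # `schedule_interval` on older Airflow versions
    catchup=False,
) as dag:
    run_notebook = DatabricksSubmitRunOperator(
        task_id="run_etl_notebook",
        databricks_conn_id="databricks_default",  # assumed connection id
        new_cluster={
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/ETL/ingest_orders"},  # placeholder
    )
```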
Posted 5 days ago
10.0 - 12.0 years
0 Lacs
Gurugram, Haryana, India
On-site
JR: R00208204
Experience: 10-12 Years
Educational Qualification: Any Degree
Job Title: S&C - Data and AI - CFO&EV – Quantexa Platform (Assoc Manager)
Management Level: 8 - Associate Manager
Location: Pune, PDC2C
Must-have skills: Quantexa Platform
Good to have skills: Experience in financial modeling, valuation techniques, and deal structuring.

Job Summary: This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions.

Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance.

WHAT’S IN IT FOR YOU?
The Accenture CFO & EV team within Data & AI has a comprehensive suite of capabilities in Risk, Fraud, Financial Crime, and Finance. Within the risk realm, our focus revolves around model development, model validation, and auditing of models. Additionally, our work extends to ongoing performance evaluation, vigilant monitoring, meticulous governance, and thorough documentation of models.
Get to work with top financial clients globally.
Access resources enabling you to utilize cutting-edge technologies, fostering innovation with the world’s most recognizable companies.
Accenture will continually invest in your learning and growth and will support you in expanding your knowledge.
You’ll be part of a diverse and vibrant team collaborating with talented individuals from various backgrounds and disciplines, continually pushing the boundaries of business capabilities and fostering an environment of innovation.

What You Would Do In This Role
Engagement Execution
Lead client engagements that may involve model development, validation, governance, strategy, transformation, implementation, and end-to-end delivery of fraud analytics/management solutions for Accenture’s clients.
Advise clients on a wide range of Fraud Management/Analytics initiatives. Projects may involve Fraud Management advisory work for CXOs to achieve a variety of business and operational outcomes.
Develop and frame Proofs of Concept for key clients, where applicable.
Practice Enablement
Mentor, groom, and counsel analysts and consultants.
Support development of the Practice by driving innovations and initiatives.
Develop thought capital and disseminate information on current and emerging trends in Fraud Analytics and Management.
Support efforts of the sales team to identify and win potential opportunities by assisting with RFPs and RFIs. Assist in designing POVs and GTM collateral.
Travel: Willingness to travel up to 40% of the time.
Professional Development Skills: Project Dependent

Professional & Technical Skills:
Relevant experience in the required domain.
Strong analytical, problem-solving, and communication skills.
Ability to work in a fast-paced, dynamic environment.
Advanced skills in development and validation of fraud analytics models, strategies, and visualizations.
Understanding of new and evolving methodologies, tools, and technologies in the fraud management space.
Expertise in one or more domains/industries, including regulations, frameworks, etc.
Experience in building models using AI/ML methodologies.
Modeling: Experience in one or more analytical tools such as SAS, R, Python, SQL, etc.
Knowledge of data processes, ETL, and tools/vendor products such as VISA AA, FICO Falcon, EWS, RSA, IBM Trusteer, SAS AML, Quantexa, Ripjar, Actimize, etc.
Proven experience in one of data engineering, data governance, or data science roles. Experience in Generative AI or central/supervisory banking is a plus. Strong conceptual knowledge and practical experience in the development, validation, and deployment of ML/AI models. Hands-on programming experience with analytics and visualization tools (Python, R, PySpark, SAS, SQL, Power BI/Tableau). Knowledge of big data, MLOps, and cloud platforms (Azure/GCP/AWS). Strong written and oral communication skills. Project management skills and the ability to manage multiple tasks concurrently. Strong delivery experience on short- and long-term analytics projects.

Additional Information:
Opportunity to work on innovative projects.
Career growth and leadership exposure.

About Our Company | Accenture
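As a hedged, self-contained illustration of the fraud-model development this role describes, the toy Python sketch below trains and validates a supervised classifier on synthetic data; the features, labels, and data volumes are invented purely for illustration.

```python
# Toy fraud-scoring model with scikit-learn on synthetic data; in practice
# the features would be engineered transaction attributes, not random draws.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 6))  # stand-ins for amount, velocity, geo features
y = (X[:, 0] + rng.normal(size=5000) > 2).astype(int)  # rare "fraud" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)

# Score held-out transactions and report discrimination (AUC), the usual
# first validation metric for a fraud model.
scores = model.predict_proba(X_te)[:, 1]
print("validation AUC:", round(roc_auc_score(y_te, scores), 3))
```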
Posted 5 days ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Description
GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/)

Job Summary
Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.

Key Responsibilities
Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements frameworks to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access, and retention of data for internal and external users. Designs and provides guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure, optimizing database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum, and Kanban. Coaches and develops less experienced team members.

Responsibilities
Competencies:
System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
Collaborates - Building partnerships and working collaboratively with others to meet shared objectives.
Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
Customer focus - Building strong customer relationships and delivering customer-centric solutions.
Decision quality - Making good and timely decisions that keep the organization moving forward.
Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies.
Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
Solution Documentation - Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning.
Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
Data Quality - Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process, leveraging industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented.
Values differences - Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience
Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred and includes:
5-8 years of experience
Familiarity with analyzing complex business systems, industry requirements, and/or data regulations
Background in processing and managing large data sets
Design and development for a Big Data platform using open source and third-party tools
Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
SQL query language
Clustered compute cloud-based implementation experience
Experience developing applications requiring large file movement for a cloud-based environment, and other data extraction tools and methods from a variety of sources
Experience in building analytical solutions
Intermediate experience in the following is preferred:
Experience with IoT technology
Experience in Agile software development

Qualifications
1) Work closely with the business Product Owner to understand product vision. 2) Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards. 4) Independently design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake.
5) Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs). 6) Take part in evaluation of new data tools and POCs and provide suggestions. 7) Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization. 8) Proactively address and resolve issues that compromise data accuracy and usability.

Preferred Skills
Programming Languages: Proficiency in languages such as Python, Java, and/or Scala.
Database Management: Expertise in SQL and NoSQL databases.
Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks.
Cloud Services: Experience with Azure, Databricks and AWS cloud platforms.
ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus.
API: Working knowledge of APIs to consume data from ERP and CRM systems.

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Remote
Job Type: Exempt - Experienced
ReqID: 2417810
Relocation Package: Yes
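As a hedged sketch of the "transactional systems to data lake" pipelines item above, the following PySpark fragment reads an ERP table over JDBC, applies light conformance, and lands partitioned Parquet; the JDBC URL, credentials, and lake path are placeholders, not Cummins specifics.

```python
# ERP-to-lake pipeline shape: JDBC extract, typed dates plus a load
# timestamp, partitioned Parquet landing. All connection details are
# hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("erp-to-lake-sketch").getOrCreate()

orders = (spark.read.format("jdbc")
          .option("url", "jdbc:sqlserver://erp-host;databaseName=ERP")
          .option("dbtable", "dbo.sales_orders")
          .option("user", "svc_etl").option("password", "***")
          .load())

# Light conformance before landing: typed dates and an audit column.
conformed = (orders
             .withColumn("order_date", F.to_date("order_date"))
             .withColumn("_loaded_at", F.current_timestamp()))

(conformed.write.mode("append")
 .partitionBy("order_date")
 .parquet("abfss://lake@account.dfs.core.windows.net/raw/sales_orders"))
```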
Posted 5 days ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Description
GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/)

Job Summary
Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.

Key Responsibilities
Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements frameworks to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access, and retention of data for internal and external users. Designs and provides guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure, optimizing database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum, and Kanban. Coaches and develops less experienced team members.

Responsibilities
Competencies:
System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
Collaborates - Building partnerships and working collaboratively with others to meet shared objectives.
Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
Customer focus - Building strong customer relationships and delivering customer-centric solutions.
Decision quality - Making good and timely decisions that keep the organization moving forward.
Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies.
Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
Solution Documentation - Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning.
Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
Data Quality - Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process, leveraging industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented.
Values differences - Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience
Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred and includes:
5-8 years of experience
Familiarity with analyzing complex business systems, industry requirements, and/or data regulations
Background in processing and managing large data sets
Design and development for a Big Data platform using open source and third-party tools
Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
SQL query language
Clustered compute cloud-based implementation experience
Experience developing applications requiring large file movement for a cloud-based environment, and other data extraction tools and methods from a variety of sources
Experience in building analytical solutions
Intermediate experience in the following is preferred:
Experience with IoT technology
Experience in Agile software development

Qualifications
1) Work closely with the business Product Owner to understand product vision. 2) Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards. 4) Independently design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake.
5) Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs). 6) Take part in evaluation of new data tools and POCs and provide suggestions. 7) Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization. 8) Proactively address and resolve issues that compromise data accuracy and usability.

Preferred Skills
Programming Languages: Proficiency in languages such as Python, Java, and/or Scala.
Database Management: Expertise in SQL and NoSQL databases.
Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks.
Cloud Services: Experience with Azure, Databricks and AWS cloud platforms.
ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus.
API: Working knowledge of APIs to consume data from ERP and CRM systems.

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Remote
Job Type: Exempt - Experienced
ReqID: 2417809
Relocation Package: Yes
Posted 5 days ago
4.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Description
GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/)

Job Summary
Supports, develops and maintains a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with the Business and IT teams to understand the requirements and best leverage the technologies to enable agile data delivery at scale.

Key Responsibilities
Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Implements methods to continuously monitor and troubleshoot data quality and data integrity issues (a small sketch of such checks follows this listing). Implements data governance processes and methods for managing metadata, access, and retention of data for internal and external users. Develops reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Develops physical data models and implements data storage architectures as per design guidelines. Analyzes complex data elements and systems, data flow, dependencies, and relationships in order to contribute to conceptual, physical and logical data models. Participates in testing and troubleshooting of data pipelines. Develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses agile development technologies, such as DevOps, Scrum, Kanban and the continuous improvement cycle, for data-driven applications.

Responsibilities
Competencies:
System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
Collaborates - Building partnerships and working collaboratively with others to meet shared objectives.
Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
Customer focus - Building strong customer relationships and delivering customer-centric solutions.
Decision quality - Making good and timely decisions that keep the organization moving forward.
Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies.
Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
Solution Documentation - Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning.
Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
Data Quality - Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process, leveraging industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented.
Values differences - Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience
4-5 years of experience. Relevant experience preferred, such as working in temporary student employment, an internship, a co-op, or other extracurricular team activities. Knowledge of the latest technologies in data engineering is highly preferred and includes:
Exposure to open-source Big Data tools: Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
SQL query language
Clustered compute cloud-based implementation experience
Familiarity developing applications requiring large file movement for a cloud-based environment
Exposure to Agile software development
Exposure to building analytical solutions
Exposure to IoT technology

Qualifications
1) Work closely with the business Product Owner to understand product vision. 2) Participate in DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards. 4) Work under limited supervision to design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake. 5) Responsible for creation of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs) with guidance and help from senior data engineers. 6) Take part in evaluation of new data tools and POCs with guidance and help from senior data engineers. 7) Take ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision. 8) Assist in resolving issues that compromise data accuracy and usability.

Programming Languages: Proficiency in languages such as Python, Java, and/or Scala.
Database Management: Intermediate-level expertise in SQL and NoSQL databases.
Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks.
Cloud Services: Experience with Azure, Databricks and AWS cloud platforms.
ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
API: Working knowledge of APIs to consume data from ERP and CRM systems.

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Remote
Job Type: Exempt - Experienced
ReqID: 2417808
Relocation Package: Yes
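To make the data-quality monitoring responsibility referenced above concrete, here is a hedged pandas sketch of the kind of automated checks a pipeline might run on a landed extract; the file name, columns, and thresholds are assumptions for illustration.

```python
# Simple post-load data-quality gate: row count, key uniqueness, null rate.
# File path, column names, and the 1% threshold are hypothetical.
import pandas as pd

df = pd.read_parquet("raw/sales_orders.parquet")  # placeholder landed file

checks = {
    "non_empty": len(df) > 0,
    "order_id_unique": df["order_id"].is_unique,
    "amount_null_rate_ok": df["amount"].isna().mean() < 0.01,
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    # In a real pipeline this would raise an alert instead of an exception.
    raise ValueError(f"data quality checks failed: {failed}")
print("all data quality checks passed")
```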
Posted 5 days ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Brightly Software Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you. About The Job The Vice President of Software Engineering for our SaaS platform is a pivotal leadership role, responsible for overseeing the entire platform engineering function from strategy to execution, and team management to culture. This position is crucial for guiding the technical execution of our mission. It demands technical expertise, strategic thinking, and strong leadership skills to drive the development and delivery of a high-quality, scalable SaaS product. As a key member of the company’s Senior Leadership Team (SLT), the VP of Software Engineering actively contributes to the overall technology strategy, vision, and execution of Brightly products as we expand and grow across our core verticals in Education, Government, Healthcare, and Manufacturing. This leader is accountable for the planning, organization, and execution of all platform engineering activities, including directing engineering efforts to meet customer needs and providing ongoing technology product development. The role is part of the Technology organization, reporting directly to the Chief Technology Officer. What You Will Do Define and implement the overarching platform engineering strategy to align with the company’s product vision and business objectives. Lead and guide the Platform Engineering organization in executing global R&D roadmap priorities. Build, mentor, and sustain a high-performing engineering team by promoting a collaborative and innovative work environment. Collaborate closely with product management to translate product vision into actionable roadmaps. Oversee the development and operation of a highly reliable SaaS platform, ensuring optimal uptime and performance. Drive initiatives for product transformation, including migrating and building new products on the platform. Partner with product, design, sales, customer success, and marketing teams to ensure alignment and successful product launches. Make informed technical decisions daily, balancing engineering, business, and customer considerations. Foster a data-driven environment focused on continuous improvement to enhance efficiency, performance, scalability, stability, and security. Ensure adherence to technology standards and procedures as established by Engineering leadership. Deliver engineering excellence in execution, technical vision, and direction, ensuring alignment with overall company strategy. Advocate for the continued adoption of Agile methodologies, including daily stand-ups, JIRA management, and sprint planning, as part of our Brightly Way of Working. Regularly communicate with stakeholders regarding the status of the platform engineering team, highlighting achievements and identifying areas requiring additional support. What You Need BS in Computer Science or other relevant engineering field; MBA a plus. 
10+ years of software engineering experience on a SaaS solution, platform, marketplace, or other product-led offering in an agile environment, and 15+ years of experience overall
Deep understanding of applications, APIs, accessibility best practices, Infrastructure-as-Code principles, containers, serverless computing and cloud environments
Strong analytical and problem-solving skills, with meticulous attention to detail
Strong understanding of cost dynamics for software and infrastructure
Expertise in technologies (e.g. Java, .NET, RedHat, AWS, ETL tools, front-end technologies, GitHub Copilot, Jira)
Leadership experience growing and managing teams of 100+ individuals with varying experience levels and skillsets
Experience with technologies in operations management, EdTech, GovTech, Manufacturing, Healthcare and Data Analytics software a plus
Experience coaching, mentoring, building and maintaining an Engineering talent pipeline
Travel Requirements: Approximately 20%

What Makes You a Standout
A pragmatic decision-maker, with the ability to prioritize tasks in the context of business objectives, technology constraints, and dependencies
A confident leader, not driven by ego, with both the conviction and willingness to make decisions and the confidence to seek collaborative solutions
A results-oriented team player who leads by example, holds themselves accountable for performance, takes ownership, and champions efforts with enthusiasm and conviction

The Brightly culture
We’re guided by a vision of community that serves the ambitions and wellbeing of all people, and our professional communities are no exception. We model that ideal every day by being supportive, collaborative partners to one another, conscientiously making space for our colleagues to grow and thrive. Our passionate team is driven to create a future where smarter infrastructure protects the environments that shape and connect us all. That brighter future starts with us.
Posted 5 days ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Why join Safeguard Global? We want to help you “Work in Any Way” - making time for family, commitments, and life outside, so that you can have the best of both worlds. When you own what you do and are driven to deliver, you have the flexibility to decide where and how you work.

The role in a nutshell: The Workday Analyst/Administrator will ensure compliance in HR processes. This position will assist in the configuration and go-live of our new Workday HCM system, integrate Workday applications for our current team transitioning to Workday, and ensure that the value of the system is maximized. Following implementation, the Analyst/Administrator will administer the Workday system, provide functional support, update the system to meet our needs, monitor end-user usage and perform daily administrative tasks.

How you will make a difference:
Provide a proactive interface between HR client groups and Workday to ensure effective coordination and delivery of Workday implementations
Map business requirements to Workday and drive business process design, including technical requirements, configuration, testing, documentation and change management alignment
Identify client enhancement opportunities for existing and new features introduced across new Workday releases
Support all components of implementations, including assembling data-gathering workbooks as well as system testing and UAT
Plan and direct the post-implementation and ongoing administration of Workday HCM
Test business processes and Workday configuration to ensure maximum system output
Ensure data and system integrity
Provide training to end users, help develop and maintain Workday training materials and standard operating procedures, and provide user support services
Support end users in their use of different Workday applications
Assist in the review, testing, validation, and implementation of new modules and functionality
Write, maintain, and support a variety of reports and queries – both standard and custom (a hedged report-pull sketch follows this listing)
Research and resolve HCM problems, unexpected results, and process flaws
Troubleshoot Workday configuration and reporting issues to identify and fix root causes
Develop user procedures, guidelines, and documentation
Monitor Workday weekly releases and assess and test semi-annual updates to ensure continued data integrity and business process operation
Participate in the development and delivery of new Workday features and functionality
Collaborate with other organizations within Workday’s Community to share knowledge and create and/or adhere to industry best practices

What will give you an advantage:
Prior experience implementing or maintaining Workday HCM systems
Ability to model and configure workflows and systems that recognize both complexity and user experience
Knowledge of, or ability to learn, Workday integration tools (EIB, Workday Studio, Workday Report Writer) and ETL processes and tools
Ability to work with XML, Java, and Web Services based integrations
Working knowledge and language of business areas including Human Resources, Benefits, Compensation, and Talent Acquisition strongly preferred
Strong interpersonal, communication, and customer service skills and the ability to interact with users at all levels
Demonstrated ability to work in a team environment and ability to both accept and provide guidance and direction as necessary
Strong business English, including the ability to explain technical concepts in an understandable, non-technical fashion
Strong customer service ability and commitment to Safeguard values

Who we are and what we do: Safeguard Global is… Global! With offices worldwide, we help 1500+ companies hire, manage, and pay employees in 170+ countries. It's all about people! Join us to meet diverse folks, explore new cultures, and connect with amazing folks from around the globe.

Our Global Benefits
Autonomy & Flexibility (Work in Any Way): Be supported with as much flexibility as possible.
Bonding Leave: Enjoy paid leave to bond with your new family member.
2 Charitable Days: Contribute to causes you believe in.
Reward & Recognition Program: Be rewarded for your success and championing our values.
Corporate bonus/SIP: All Guardians are eligible for our annual bonus scheme or sales incentive plan.

Why become a Guardian:
International Environment: Grow your network internationally and collaborate across the world. Interact, discover cultures, and tap into local expertise.
Our Culture: We emphasize the people factor in everything we do. Our nurturing culture ensures your ideas reach our leaders and your contributions get the recognition they deserve.
Learning: We support your continuous growth by providing access to 2 learning platforms, where you can learn at your own pace.

Next Steps: To apply, please click on the following link. We wish you the best with your application. Our Guardian promise to you is to keep in touch to arrange the next stage should your application meet the position's requirements, or to send a gentle update if you have been unsuccessful at this time. Welcome to the Future of Work!

At Safeguard Global, we are committed to providing an environment of mutual respect where equal employment opportunities are available to all applicants and Guardians.
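Referencing the Workday Report Writer and integration items above, here is a hedged Python sketch of pulling a custom report through Workday's Report-as-a-Service (RaaS) URL pattern; the tenant, report name, field names, and credentials are invented placeholders, not a real endpoint.

```python
# Hypothetical RaaS report pull: Workday custom reports can typically be
# exposed at a URL of roughly this shape and returned as JSON. Everything
# below (host, tenant, report, fields) is an illustrative assumption.
import requests

RAAS_URL = (
    "https://wd2-impl-services1.workday.com/ccx/service/customreport2/"
    "acme_tenant/integration_user/Headcount_Report"
)

resp = requests.get(
    RAAS_URL,
    params={"format": "json"},
    auth=("integration_user@acme_tenant", "***"),  # placeholder credentials
    timeout=30,
)
resp.raise_for_status()

# Workday RaaS JSON output conventionally nests rows under "Report_Entry".
for row in resp.json().get("Report_Entry", []):
    print(row.get("Employee_ID"), row.get("Position_Title"))
```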
Posted 5 days ago
5.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 5 Year(s) Of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions for work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have minimum 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Mumbai office.
- A 15 years full time education is required.
Posted 5 days ago
5.0 years
0 Lacs
India
Remote
As a Sales Engineer, you will gain an in-depth understanding of Silent Eight's product offerings and roadmap while collaborating closely with sales directors to present tailored solutions to clients. Your responsibilities include delivering technical presentations and demonstrations, establishing trusted relationships with clients, and providing ongoing technical support to drive sales efforts. Additionally, you will play a key role in guiding customers through the installation process and ensuring a seamless transition from pre-sales to post-sales activities.

Key Duties
Work closely with the revenue team to understand customer needs and provide technical expertise during the sales process.
Support remote and in-person meetings to discuss business and technical fit with Silent Eight’s solutions.
Build relationships with customers as a trusted technical advisor and subject matter specialist.
Engage with customers to assess their pain points, encourage conversation, understand current workflows, and provide recommendations on leveraging Silent Eight’s products effectively.
Deliver compelling presentations and demonstrations showcasing product capabilities, use cases, and value propositions. Tell the story of how Silent Eight’s solutions will work for a customer, instilling confidence.
Help create technical proposals, RFP/RFI responses, and supporting documentation for sales opportunities.
Assist in setting up and running PoVs to validate the effectiveness of the solution for potential clients.
Ensure a smooth handover to implementation teams, providing clear and concise SOWs and assisting with customer onboarding activities.
Stay updated on market trends, competitors, and emerging technologies to better position solutions.
Educate both internal teams and customers on product features, best practices, and implementation strategies.
Collaborate effectively with product and engineering teams to provide valuable customer feedback and recommendations for new features.
Assist marketing and business development by attending and presenting at events and webinars, promoting Silent Eight’s solutions and thought leadership on social media, and contributing to content creation while reviewing existing materials to ensure relevance and impact.

Qualifications:
At least 5 years of experience as a Pre-Sales Engineer in the AML space or a similar role.
Experience working with AML solutions and familiarity with wider AML use cases.
Experience selling enterprise software, specifically to Tier 1 banks and large-scale financial institutions.
Strong practical understanding of SaaS, on-premise and cloud architectures and how modern software is built, tested and deployed.
Familiarity with key methodologies for integrating and deploying mission-critical solutions for customers, including expertise in REST API integration, databases, and ETL methodologies.
Knowledge of encryption, data storage, high availability (HA) architectures, and identity and access management (IAM) tools, as well as scripting and automation (CI/CD).
Experience working with AI technologies, including the integration and application of machine learning models, natural language processing (NLP), and data-driven AI solutions.
Problem-solving ability: skilled in identifying customer challenges and designing tailored software solutions that meet their unique needs.
Strong communication and interpersonal skills. Ability to explain technical solutions and concepts to non-technical people.
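To make the REST-API integration expectation above tangible, here is a hedged Python sketch of submitting a name-screening request to a generic AML service; the endpoint, payload shape, and response fields are hypothetical and not Silent Eight's actual API.

```python
# Illustrative name-screening call against an invented AML endpoint; the
# URL, JSON contract, and token are placeholders for demonstration only.
import requests

API_URL = "https://aml.example.com/v1/screening"  # hypothetical endpoint

payload = {
    "entity_name": "ACME Trading Ltd",
    "entity_type": "organization",
    "lists": ["sanctions", "pep"],
}
resp = requests.post(
    API_URL,
    json=payload,
    headers={"Authorization": "Bearer <token>"},  # placeholder credential
    timeout=30,
)
resp.raise_for_status()

# Assumed response shape: a list of match candidates with scores.
for hit in resp.json().get("matches", []):
    print(hit.get("list"), hit.get("score"), hit.get("matched_name"))
```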
Posted 5 days ago
5.0 years
0 Lacs
India
On-site
Amex GBT is a place where colleagues find inspiration in travel as a force for good and – through their work – can make an impact on our industry. We’re here to help our colleagues achieve success and offer an inclusive and collaborative culture where your voice is valued. We're looking for a skilled Tableau Business Intelligence Developer to elevate our data visualization and reporting capabilities.

What You'll Do:
Design and develop best-in-class visualizations and interactive dashboards using Tableau.
Collaborate with stakeholders to understand business requirements and translate them into effective BI solutions.
Ensure timely delivery of high-quality reporting solutions that meet business goals.
Work across multiple products or initiatives simultaneously with strong time management.
Apply data warehousing principles and best practices in designing scalable and optimized BI solutions.
Contribute to and support enterprise reporting strategies.
Partner with data architects, developers, and analysts to create best-in-class dashboards and reports.
Write and understand complex SQL queries; proficiency in Python is a plus (a small SQL-to-extract sketch follows this listing).
Troubleshoot and resolve performance issues in BI reports and dashboards.
Maintain clear and thorough documentation for report specifications and design logic.
Log all development activities and maintain sprint-related tasks and work logs using Jira.
Participate in sprint planning and retrospectives in an Agile development environment.
Handle sensitive data responsibly and maintain confidentiality.
Communicate effectively with technical and non-technical stakeholders through strong verbal and written communication skills.

What We're Looking For:
5+ years of experience with Tableau and other BI tools (e.g., Power BI).
Proficiency in SQL and Python scripting.
Strong understanding of data modelling, ETL processes, and data warehousing concepts.
Experience working in Agile environments with hands-on use of Jira for tracking sprints and logging work.
Knowledge of enterprise-level BI architecture is a plus.
Excellent attention to detail and focus on visual design and user experience.
Strong analytical, troubleshooting, and problem-solving abilities.
Ability to handle multiple priorities in a fast-paced, deadline-driven environment.

Location
India

The #TeamGBT Experience
Work and life: Find your happy medium at Amex GBT. Flexible benefits are tailored to each country and start the day you do. These include health and welfare insurance plans, retirement programs, parental leave, adoption assistance, and wellbeing resources to support you and your immediate family.
Travel perks: Get a choice of deals each week from major travel providers on everything from flights to hotels to cruises and car rentals.
Develop the skills you want when the time is right for you, with access to over 20,000 courses on our learning platform, leadership courses, and new job openings available to internal candidates first.
We strive to champion Inclusion in every aspect of our business at Amex GBT. You can connect with colleagues through our global INclusion Groups, centered around common identities or initiatives, to discuss challenges, obstacles, achievements, and drive company awareness and action.
And much more!
All applicants will receive equal consideration for employment without regard to age, sex, gender (and characteristics related to sex and gender), pregnancy (and related medical conditions), race, color, citizenship, religion, disability, or any other class or characteristic protected by law. Click Here for Additional Disclosures in Accordance with the LA County Fair Chance Ordinance. Furthermore, we are committed to providing reasonable accommodation to qualified individuals with disabilities. Please let your recruiter know if you need an accommodation at any point during the hiring process. For details regarding how we protect your data, please consult the Amex GBT Recruitment Privacy Statement.

What if I don't meet every requirement? If you're passionate about our mission and believe you'd be a phenomenal addition to our team, don't worry about "checking every box"; please apply anyway. You may be exactly the person we're looking for!
Posted 5 days ago
5.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions. At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients' needs and exceeding their expectations.

Senior Software Engineer

Group Description
Data Solutions - Platforms and Environments is the industry-leading content delivery platform. Clients seamlessly access organized and connected content that is easily discoverable, explorable, and procured via the FactSet Marketplace. Data is delivered via a variety of technologies and formats that meet the needs of our clients' workflows. By enabling our clients to utilize their preferred choice of industry-standard databases, programming languages, and data visualization tools, we empower them to focus on the core competencies needed to drive their business. The Data Solutions - Platforms and Environments solutions portfolio includes Standard DataFeed, Data Exploration, OnDemand (API), Views, Cornerstone, Exchange DataFeed, Benchmark Feeds, the Open:FactSet Marketplace, DataDictionary, Navigator, and other non-workstation initiatives.

Job Description
The Data Solutions - Platforms and Environments team is looking for a talented, highly motivated Senior Software Engineer (Full Stack Engineer) to join our Navigator Application initiatives, an important part of one of FactSet's highest-profile and most strategic areas of investment and development. As the Full Stack Senior Software Engineer, you will design and develop application features including UI, API, and database frameworks as well as data engineering pipelines, help implement improvements to existing pipelines and infrastructure, and provide production support. You will collaborate closely with Product Developers/Business Analysts to capture technical requirements. FactSet is happy to set up an information session with an engineer working on this product to talk about the product, the team, and the interview process.

What You'll Do:
- Implement new components and application features for a client-facing application as a Full Stack Developer.
- Maintain and resolve bugs in existing components.
- Contribute new features, fixes, and refactors to the existing code.
- Perform code reviews and coach engineers with respect to best practices.
- Work with other engineers following the test-driven methodology in an agile environment.
- Collaborate with other engineers and Product Developers in a Scrum Agile environment using Jira and Confluence.
- Work as part of a geographically diverse team.
- Create and review documentation and test plans.
- Estimate task sizes and regularly communicate progress in daily standups and biweekly Scrum meetings.
- Coordinate with other teams across offices and departments.

What We're Looking For:
- Bachelor's degree in Engineering or a relevant field required.
- 5 to 7 years of relevant experience.
- Expert-level proficiency writing and optimizing code in Python.
- Proficiency in frontend technologies such as Vue.js (preferred) or ReactJS, and experience with JavaScript, CSS, and HTML.
- Good knowledge of REST API development, preferably Python Flask and OpenAPI.
- Good knowledge of relational databases, preferably MSSQL or Postgres.
- Good knowledge of GenAI and vector databases is a plus.
- Good understanding of general database design and architecture principles.
- A realistic, pragmatic approach; can deliver functional prototypes that can be enhanced and optimized in later phases.
- Strong written and verbal communication skills.
- Working experience with AWS services: Lambda, EC2, S3, AWS Glue, etc.
- Strong working experience with any container/PaaS technology (Docker or Heroku).
- ETL and data pipeline experience is a plus.
- Working experience with Apache Spark, Apache Airflow, or GraphQL is a plus.
- Experience developing event-driven, distributed serverless infrastructure (AWS Lambda, SNS/SQS) is a plus.
- Must be a voracious learner.

What's In It For You
At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means:
- The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up.
- Support for your total well-being. This includes health, life, and disability insurance, as well as retirement savings plans and a discounted employee stock purchase program, plus paid time off for holidays, family leave, and company-wide wellness days.
- Flexible work accommodations. We value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives.
- A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged, and individuality drives solutions.
- Career progression planning with dedicated time each month for learning and development.
- Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging.

Learn More About Our Benefits Here. Salary is just one component of our compensation package and is based on several factors including but not limited to education, work experience, and certifications.

Company Overview
FactSet (NYSE:FDS | NASDAQ:FDS) helps the financial community to see more, think bigger, and work better. Our digital platform and enterprise solutions deliver financial data, analytics, and open technology to more than 8,200 global clients, including over 200,000 individual users. Clients across the buy-side and sell-side, as well as wealth managers, private equity firms, and corporations, achieve more every day with our comprehensive and connected content, flexible next-generation workflow solutions, and client-centric specialized support. As a member of the S&P 500, we are committed to sustainable growth and have been recognized among the Best Places to Work in 2023 by Glassdoor as a Glassdoor Employees' Choice Award winner. Learn more at www.factset.com and follow us on X and LinkedIn.

At FactSet, we celebrate difference of thought, experience, and perspective. Qualified applicants will be considered for employment without regard to characteristics protected by law.
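The posting above calls for REST API development with Python Flask; for flavor, a minimal Flask sketch exposing a single read-only resource. The /feeds resource and its data are illustrative assumptions, not part of any FactSet product API:

```python
# Minimal Flask REST sketch; the /feeds resource and its data are
# illustrative assumptions, not part of any FactSet product API.
from flask import Flask, jsonify

app = Flask(__name__)

FEEDS = [
    {"id": 1, "name": "Standard DataFeed", "format": "flat-file"},
    {"id": 2, "name": "Exchange DataFeed", "format": "streaming"},
]

@app.route("/feeds", methods=["GET"])
def list_feeds():
    """Return all feeds as JSON."""
    return jsonify(FEEDS)

@app.route("/feeds/<int:feed_id>", methods=["GET"])
def get_feed(feed_id):
    """Return one feed by id, or a 404 with an error body."""
    feed = next((f for f in FEEDS if f["id"] == feed_id), None)
    if feed is None:
        return jsonify({"error": "feed not found"}), 404
    return jsonify(feed)

if __name__ == "__main__":
    app.run(debug=True)
```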
Posted 5 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Details

Job Description: Power BI Analyst and Developer (with Python and Web Skills)
We're seeking a Power BI Analyst and Developer to transform data into impactful insights and integrate them into dashboards and web solutions. This role is perfect for someone passionate about data visualization, data storytelling, and leveraging technology to drive business decisions.

Responsibilities:
- Design and develop interactive Power BI reports and dashboards.
- Build robust data models (DAX, Power Query) and perform ETL processes.
- Utilize Python for advanced data manipulation, analysis, and automation.
- Develop and integrate web components to display and interact with data insights.
- Collaborate with stakeholders to understand requirements and present findings.

Qualifications:
- 3+ years in Power BI development, with strong DAX and Power Query skills.
- Proficiency in SQL and Python for data analysis.
- Understanding of web development fundamentals (HTML, CSS, JavaScript).
- Strong analytical, problem-solving, and communication skills.

Job Qualification:
- Bachelor's or Master's degree in any stream with 5+ years of hands-on experience.
- Data analytics experience (e.g., Python, SQL).
- Visualization tools (e.g., Power BI, Tableau).
- Expertise in creating applications using various scripting languages.

Job Type: Experienced Hire
Shift: Shift 1 (India)
Primary Location: India, Bangalore
Additional Locations:

Business Group
The Data Center & Artificial Intelligence Group (DCAI) is at the heart of Intel's transformation from a PC company to a company that runs the cloud and billions of smart, connected computing devices. The data center is the underpinning for every data-driven service, from artificial intelligence to 5G to high-performance computing, and DCAI delivers the products and technologies—spanning software, processors, storage, I/O, and networking solutions—that fuel cloud, communications, enterprise, and government data centers around the world.

Posting Statement
All qualified applicants will receive consideration for employment without regard to race, color, religion, religious creed, sex, national origin, ancestry, age, physical or mental disability, medical condition, genetic information, military and veteran status, marital status, pregnancy, gender, gender expression, gender identity, sexual orientation, or any other characteristic protected by local law, regulation, or ordinance.

Position of Trust: N/A
Work Model for this Role: This role will require an on-site presence. * Job posting details (such as work model, location or time type) are subject to change.
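For flavor, a minimal sketch of the kind of Python step this role might add inside Power BI's Power Query "Run Python script" transform, which hands the current query to the script as a pandas DataFrame named `dataset`; the column names and cleansing rules below are illustrative assumptions:

```python
# Inside Power BI's "Run Python script" transform, the current query
# arrives as a pandas DataFrame named `dataset`; any DataFrame left in
# scope can be loaded back into the model. Column names are assumptions.
import pandas as pd

df = dataset.copy()

# Illustrative cleansing/enrichment before the data reaches DAX measures.
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df = df.dropna(subset=["order_date", "amount"])
df["amount"] = df["amount"].astype(float)
df["order_month"] = df["order_date"].dt.to_period("M").astype(str)

cleaned = df  # Power BI picks this DataFrame up as the script output
```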
Posted 5 days ago
4.0 - 7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems; ensuring secure, efficient data storage and retrieval; enabling self-service data exploration; and supporting stakeholders with insightful reporting and analysis.

Grade: T5

Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date.

Accountabilities
What your main responsibilities are:
- Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
- Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization; data pre-processing including collecting, parsing, managing, analyzing, and visualizing large sets of data.
- Data Quality Management - Cleanse the data and improve data quality and readiness for analysis; drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms.
- Data Transformation - Process data by cleansing it and transforming it to the proper storage structure for querying and analysis using ETL and ELT processes.
- Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations.

Qualifications & Specifications:
- Master's/Bachelor's degree in Engineering/Computer Science/Math/Statistics or equivalent.
- Strong programming skills in Python/PySpark/SAS.
- Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization.
- Experience on cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps.
- Hands-on experience with Databricks, Delta Lake, and Workflows.
- Knowledge of DevOps processes and tools such as Docker, CI/CD, Kubernetes, Terraform, and Octopus.
- Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs.
- Experience with any BI tool like Power BI (good to have).
- Cloud migration experience (good to have).
- Cloud and data engineering certification (good to have).
- Experience working in an Agile environment.
- 4-7 years of relevant work experience needed.
- Experience with stakeholder management will be an added advantage.

What We Are Looking For
Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred.
Knowledge, Skills and Abilities: fluency in English; analytical skills; accuracy and attention to detail; numerical skills; planning and organizing skills; presentation skills; data modeling and database design; ETL (Extract, Transform, Load) skills; programming skills.

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.
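As a flavor of the Databricks/Delta Lake pipeline work listed above, here is a minimal PySpark sketch of an ETL step: raw CSV in, cleansed Delta table out. The paths and column names are illustrative assumptions:

```python
# Minimal PySpark ETL sketch: read raw CSV, cleanse, write a Delta table.
# Paths and column names are illustrative assumptions. Assumes a runtime
# with the Delta Lake libraries available (e.g., Databricks).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("shipments_etl").getOrCreate()

raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("/mnt/raw/shipments/"))

cleansed = (raw
            .dropDuplicates(["shipment_id"])
            .filter(F.col("shipment_id").isNotNull())
            .withColumn("ship_date", F.to_date("ship_date", "yyyy-MM-dd"))
            .withColumn("weight_kg", F.col("weight_kg").cast("double")))

(cleansed.write
 .format("delta")
 .mode("overwrite")
 .save("/mnt/curated/shipments/"))
```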
Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company.

Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.
Posted 5 days ago
14.0 - 18.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About the Role:
The role involves creating innovative solutions, guiding development teams, ensuring technical excellence, and driving architectural decisions aligned with company policies. The Solution Designer/Tech Lead will be a key technical advisor, collaborating with onshore teams and leadership to deliver high-impact Data and AI/ML projects.

Responsibilities:
- Design and architect Generative AI solutions leveraging AWS services such as Bedrock, S3, PG Vector, Kendra, and SageMaker.
- Collaborate closely with developers to implement solutions, providing technical guidance and support throughout the development lifecycle.
- Lead the resolution of complex technical issues and challenges in AI/ML projects.
- Conduct thorough solution reviews and ensure adherence to best practices and company standards.
- Navigate governance processes and obtain necessary approvals for initiatives.
- Make critical architectural and design decisions aligned with organizational policies and industry best practices.
- Liaise with onshore technical teams, presenting solutions and providing expert analysis on proposed approaches.
- Conduct technical sessions and knowledge-sharing workshops on AI/ML technologies and AWS services.
- Evaluate and integrate emerging technologies and frameworks like LangChain into solution designs.
- Develop and maintain technical documentation, including architecture diagrams and design specifications.
- Mentor junior team members and foster a culture of innovation and continuous learning.
- Collaborate with data scientists and analysts to ensure optimal use of data in AI/ML solutions.
- Coordinate with clients, data users, and key stakeholders to achieve long-term objectives for data architecture.
- Stay updated on the latest trends and advancements in AI/ML, cloud, and data technologies.

Key Qualifications and Experience:
- Extensive experience (14-18 years) in software development and architecture, with a focus on AI/ML solutions.
- Deep understanding of AWS services, particularly those related to AI/ML (Bedrock, SageMaker, Kendra, etc.).
- Proven track record in designing and implementing data, analytics, reporting, and/or AI/ML solutions.
- Strong knowledge of data structures, algorithms, and software design patterns.
- Expertise in data management, analytics, and reporting tools.
- Proficiency in at least one programming language commonly used in AI/ML (e.g., Python, Java, Scala).
- Familiarity with DevOps practices and CI/CD pipelines.
- Understanding of AI ethics, bias mitigation, and responsible AI principles.
- Basic understanding of data pipelines and ETL processes, with the ability to design and implement efficient data flows for AI/ML models.
- Experience working with diverse data types (structured, unstructured, and semi-structured) and the ability to preprocess and transform data for use in generative AI applications.
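For flavor, a minimal sketch of calling a foundation model through the AWS Bedrock runtime with boto3; the model ID and the Anthropic-style request body are assumptions about one common configuration, not a prescribed design, and real solutions should follow the documented schema of whichever model is chosen:

```python
# Minimal sketch: invoke a foundation model via the Bedrock runtime.
# The model ID and Anthropic-style request body are assumptions about
# one common configuration; check the model's documented schema.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Summarize our data architecture goals."}
    ],
}

response = client.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model ID
    body=json.dumps(body),
)

result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```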
Posted 5 days ago
4.0 - 7.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems; ensuring secure, efficient data storage and retrieval; enabling self-service data exploration; and supporting stakeholders with insightful reporting and analysis.

Grade: T5

Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date.

Accountabilities
What your main responsibilities are:
- Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
- Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization; data pre-processing including collecting, parsing, managing, analyzing, and visualizing large sets of data.
- Data Quality Management - Cleanse the data and improve data quality and readiness for analysis; drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms.
- Data Transformation - Process data by cleansing it and transforming it to the proper storage structure for querying and analysis using ETL and ELT processes.
- Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations.

Qualifications & Specifications:
- Master's/Bachelor's degree in Engineering/Computer Science/Math/Statistics or equivalent.
- Strong programming skills in Python/PySpark/SAS.
- Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization.
- Experience on cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps.
- Hands-on experience with Databricks, Delta Lake, and Workflows.
- Knowledge of DevOps processes and tools such as Docker, CI/CD, Kubernetes, Terraform, and Octopus.
- Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs.
- Experience with any BI tool like Power BI (good to have).
- Cloud migration experience (good to have).
- Cloud and data engineering certification (good to have).
- Experience working in an Agile environment.
- 4-7 years of relevant work experience needed.
- Experience with stakeholder management will be an added advantage.

What We Are Looking For
Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred.
Knowledge, Skills and Abilities: fluency in English; analytical skills; accuracy and attention to detail; numerical skills; planning and organizing skills; presentation skills; data modeling and database design; ETL (Extract, Transform, Load) skills; programming skills.

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.
Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company.

Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.
Posted 5 days ago
8.0 years
0 Lacs
India
Remote
Job Title: GCP Data Engineer
Location: Remote (only from India)
Employment Type: Contract, long-term
Start Date: Immediate
Time Zone Overlap: Must be available to work during EST hours (Canada)
Dual Employment: Not permitted - must be terminated if applicable

About the Role:
We are looking for a highly skilled GCP Data Engineer to join our international team. The ideal candidate will have strong experience with Google Cloud Platform's data tools, particularly Dataproc and BigQuery, and will be comfortable working in a remote, collaborative environment. You will play a key role in designing, building, and optimizing the data pipelines and infrastructure that drive business insights.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes on GCP.
- Leverage GCP Dataproc and BigQuery to process and analyze large volumes of data.
- Write efficient, maintainable code using Python and SQL.
- Develop Spark-based data workflows using PySpark.
- Collaborate with cross-functional teams in an international environment.
- Ensure data quality, integrity, and security.
- Participate in code reviews and optimize system performance.

Required Qualifications:
- 5-8 years of hands-on experience in data engineering.
- Proven expertise in GCP Dataproc and BigQuery.
- Strong programming skills in Python and SQL.
- Solid experience with PySpark for distributed data processing.
- Fluent English with excellent communication skills.
- Ability to work independently in a remote team environment.
- Comfortable working during the Canada EST time zone overlap.

Optional / Nice-to-Have Skills:
- Experience with additional GCP tools and services.
- Familiarity with CI/CD for data engineering workflows.
- Exposure to data governance and data security best practices.

Interview Process:
1. Technical test (online screening)
2. 15-minute HR interview
3. Technical interview with 1-2 rounds

Please share your profile only if you match the above JD: hiring@khey-digit.com
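To illustrate the Dataproc-plus-BigQuery combination, a minimal PySpark sketch using the spark-bigquery connector that Dataproc clusters can include; the project, dataset, table, and bucket names are illustrative assumptions:

```python
# Minimal PySpark sketch for Dataproc: read from BigQuery with the
# spark-bigquery connector, aggregate, and write results back.
# Project/dataset/table/bucket names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bq_events_rollup").getOrCreate()

events = (spark.read.format("bigquery")
          .option("table", "my-project.analytics.events")
          .load())

daily = (events
         .groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
         .agg(F.count("*").alias("event_count")))

(daily.write.format("bigquery")
 .option("table", "my-project.analytics.daily_event_counts")
 .option("temporaryGcsBucket", "my-temp-bucket")  # staging bucket for load
 .mode("overwrite")
 .save())
```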
Posted 5 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description:
- Strong experience in Core Java (data structures, algorithms)
- Strong experience in OOP concepts
- Strong experience in ETL / Oracle
Posted 5 days ago
5.0 years
0 Lacs
India
On-site
The Senior Integration Engineer will design, develop, and implement automations and integrations between various systems and applications. This role will collaborate with cross-functional teams to understand business requirements, translate them into efficient and scalable integration solutions, and drive continuous improvement in automation processes. The ideal candidate will have a strong understanding of enterprise systems, APIs, cloud-based technologies, and integration architecture.

What You'll Do:
- Develop, design, and implement common architecture patterns, including microservices, event-driven architecture, and service-oriented architecture.
- Design, develop, and implement integrations using the Workato platform to connect various systems and applications.
- Collaborate with business stakeholders and technical teams to gather automation and integration requirements and translate them into technical specifications.
- Build and maintain integration workflows, recipes, and connectors within the Workato platform.
- Perform testing and debugging of integrations to ensure accuracy, reliability, and performance.
- Monitor and troubleshoot integration issues, providing timely resolution and support.
- Document integration processes, workflows, and configurations for future reference.
- Stay up to date with the latest trends and best practices in integration technologies and tools.
- Collaborate with cross-functional teams to ensure seamless integration of systems and applications.
- Provide technical guidance and support to team members and end users.

What You'll Need
Minimum qualifications:
- 5+ years of proven experience as a Workato Integration Engineer or in a similar integration role.
- Bachelor's degree in Computer Science, Information Systems, or an equivalent combination of education and experience.
- Proficiency in at least one scripting language (Python, JavaScript, or similar).
- Experience with version control systems (Git) and CI/CD pipelines.
- Strong SQL and database management skills across multiple database systems.
- Experience with JSON, XML, and other data interchange formats.

Preferred qualifications:
- Strong knowledge of integration concepts, methodologies, and best practices.
- Proficiency in the Workato platform and its features, including connectors, recipes, and workflows.
- Strong proficiency in RESTful APIs and web services, with a deep understanding of API design principles and best practices.
- Knowledge of integration patterns, data transformation, and ETL processes.
- Experience implementing integration projects, including complex enterprise systems and cloud business apps.
- Familiarity with cloud-based systems and applications such as Workiva, Salesforce, Marketo, Anaplan/Adaptive, NetSuite/SAP/Oracle, FinancialForce/OpenAir, Gainsight, Xactly, etc.
- Proactive in problem-solving, ready to take on additional initiatives and responsibilities as they arise.
- Demonstrated success on high-performing integration engineering teams.
- Excellent written and oral communication skills, conveying complex technical concepts to non-technical audiences.
- Ability to work independently and manage multiple projects simultaneously.
- Knowledge of other integration platforms, such as MuleSoft or Dell Boomi, is a plus.
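As a small illustration of the data-transformation side of integration work, a Python sketch that maps a record from one system's JSON schema to another's; both schemas are invented for the example, and a real integration would typically drive such mappings from configuration rather than code:

```python
# Minimal sketch of a field-mapping transform between two systems'
# JSON schemas. Both schemas are invented for this example.
import json

FIELD_MAP = {
    "FirstName": "given_name",
    "LastName": "family_name",
    "Email": "email_address",
    "AccountId": "crm_account_id",
}

def transform(source_record: dict) -> dict:
    """Rename mapped fields and drop anything unmapped."""
    return {
        target: source_record[source]
        for source, target in FIELD_MAP.items()
        if source in source_record
    }

crm_payload = {"FirstName": "Ada", "LastName": "Lovelace",
               "Email": "ada@example.com", "AccountId": "001XYZ"}
print(json.dumps(transform(crm_payload), indent=2))
```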
Posted 5 days ago
2.0 years
0 Lacs
India
On-site
Impact you will make
Under the direction of the HRIS Manager and HR leadership, the HRIS Analyst collaborates with key business stakeholders to create, support, and optimize business processes and HR initiatives to achieve departmental and company outcomes. Primarily leveraging all aspects of Dayforce HCM, or other applications where scale and efficiency are needed, the Analyst works with a high level of independence, collaboration, and strategic mindset to accomplish both short-term and long-term team and organizational objectives. The work is fast-paced, sometimes ambiguous and challenging, and requires a very organized, highly communicative team player who is capable of absorbing and synthesizing new skills and practices to get things done effectively.

What you will do:
- Document processes, identifying HR concerns and compiling data analysis reports.
- Evaluate opportunities for continuous process and governance improvements, and partner to create a plan to execute improvements.
- Incorporate and translate team and company objectives, end-user experiences, and best practices into technical projects or solutions.
- Provide hands-on support and training, with a high degree of customer satisfaction, to the HR team, corporate leadership, and other key stakeholders in the company.
- Handle most day-to-day troubleshooting in our HRIS system, upgrades, and escalations.
- Manage assigned projects, implementation of new features, and other HCM/HRIS assignments, with a high degree of sustainability, effectiveness, and customer satisfaction.
- Manage SaaS HCM upgrades, defect tracking, infrastructure patches, and other vendor changes.
- Partner with functional SMEs, gaining knowledge and suggesting the best and most scalable overall solutions as appropriate.
- Build and support data interfaces, APIs, and integrations.
- Maintain data integrity and ensure compliance with relevant regulations.
- Build, analyze, and support HR/Payroll data requests and solutions for workforce business intelligence or reporting needs.
- Effectively maintain and build operating procedures, workflows, SOPs, and other documentation.
- Stay updated on company and industry trends, products, and services.

What you will bring:
- Bachelor's degree preferred.
- 2+ years' proven experience handling US or India HR and Payroll operations, or HR certifications plus 1 year of HR experience.
- 1+ year implementing or executing solutions in Dayforce HCM, preferably with Reporting, API/Integration Studio, Dashboards/Analytics, Benefits, Payroll, WFM/Time Tracking, Workflows/Forms, or other Talent modules.
- Very strong analytical and problem-solving skills, especially focused on large and disparate HR data, files, or data interfaces.
- Advanced Microsoft Excel skills, including formulas, PivotTables, PowerUps, and Lookup functions; MS ETL functions a plus.
- Highly organized, detail-oriented, and customer-focused.
- Ability to work independently and as part of a team; flexible and persistent, with a positive can-do perspective.
- Excellent communication skills, including the ability to effectively interface with colleagues at all levels of the organization.

What we would like to see:
- Experience with India Human Resources and Payroll administration is a plus, as is healthcare experience.

Physical Demands
The physical demands and work environment characteristics described here are representative of those that an employee must meet to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
- While performing the duties of this job, the employee is occasionally required to move around the work area; sit; perform manual tasks; operate tools and other office equipment such as computer, computer peripherals, and telephones; extend arms; kneel; talk and hear.
- Must occasionally lift and/or move up to 15 pounds.
- Must be able to talk, listen, and speak clearly on telephone or video calls.
- Mental demands: the employee must be able to follow directions, get along with others, and handle stress.
- Work environment: the noise level in the work environment is usually minimal.
- Ability to safely and successfully perform the essential job functions consistent with the ADA, FMLA, and other federal, state, and local standards, including meeting qualitative and/or quantitative productivity standards.
- Ability to maintain regular, punctual attendance consistent with the ADA, FMLA, and other federal, state, and local standards.
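By way of illustration of the data-interface analysis described above, a small pandas sketch that sanity-checks an HR extract file before it is loaded downstream; the file name and column names are assumptions:

```python
# Minimal sketch: sanity-check an HR extract before downstream load.
# The file name and column names are illustrative assumptions.
import pandas as pd

df = pd.read_csv("dayforce_employee_extract.csv")

issues = []
if df["employee_id"].duplicated().any():
    issues.append("duplicate employee_id values")
if df["email"].isna().any():
    issues.append("missing email addresses")
bad_dates = pd.to_datetime(df["hire_date"], errors="coerce").isna()
if bad_dates.any():
    issues.append(f"{bad_dates.sum()} unparseable hire_date values")

print("OK" if not issues else "Found: " + "; ".join(issues))
```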
Posted 5 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description: Salesforce Senior Developer
Experience: Total: 5+ years; Relevant: 3+ years

Responsibilities:
- Meet with clients to determine business, functional, and technical requirements, and participate in application design, configuration, testing, and deployment.
- Perform configuration and customization of the Salesforce.com platform.
- Participate in efforts to develop and execute testing, training, and documentation.
- Participate in the sales cycle as needed (solution definition, pre-sales, estimating, and project planning).
- Be hands-on in producing tangible deliverables (requirements specifications, design deliverables, status reports, project plans).
- Proactively engage in continuous improvement efforts for application design, support, and practice development.
- Provide technical assistance and end-user troubleshooting for bug fixes, enhancements, and "how-to" assistance.
- Perform regular reviews of implementations done by less experienced developers, offering feedback and suggestions on their code.
- Mentor the junior and mid-level developers of the team, and delegate tasks to team members in a balanced and effective manner.
- Set up a development environment independently, and mentor a team of junior developers.
- Independently communicate with both client technical teams and business owners as needed during design and implementation.

Knowledge and Skill:
- 3+ years of experience working on Salesforce platforms.
- At least the Salesforce certification "Salesforce Platform Developer I".
- Direct experience working on CRM projects for middle-market and enterprise-size companies.
- Working knowledge and experience with complex business systems integration as well as object-oriented design patterns and development.
- Software engineering skills with the Force.com platform (Apex, LWC, SOQL, unit testing).
- Experience in core web technologies including HTML5, JavaScript, and jQuery.
- Demonstrated experience and knowledge of relational databases, data modelling, and ETL tools.
- Experience with web services (REST & SOAP, JSON & XML, etc.).
- Experience with Agile development methodologies such as SCRUM.
- Excellent organizational, verbal, and written communication skills.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
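For a taste of the web-services side of the role, a minimal Python sketch that runs a SOQL query through Salesforce's REST API; the instance URL, API version, and access token are placeholders a real client would obtain through an OAuth flow:

```python
# Minimal sketch: run a SOQL query via the Salesforce REST API.
# The instance URL, API version, and token are placeholders; a real
# client would obtain the token through an OAuth flow.
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"
TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"

resp = requests.get(
    f"{INSTANCE}/services/data/v58.0/query",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"q": "SELECT Id, Name FROM Account LIMIT 5"},
    timeout=30,
)
resp.raise_for_status()

for record in resp.json()["records"]:
    print(record["Id"], record["Name"])
```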
Posted 5 days ago
5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description: Salesforce Senior Developer
Experience: Total: 5+ years; Relevant: 3+ years

Responsibilities:
- Meet with clients to determine business, functional, and technical requirements, and participate in application design, configuration, testing, and deployment.
- Perform configuration and customization of the Salesforce.com platform.
- Participate in efforts to develop and execute testing, training, and documentation.
- Participate in the sales cycle as needed (solution definition, pre-sales, estimating, and project planning).
- Be hands-on in producing tangible deliverables (requirements specifications, design deliverables, status reports, project plans).
- Proactively engage in continuous improvement efforts for application design, support, and practice development.
- Provide technical assistance and end-user troubleshooting for bug fixes, enhancements, and "how-to" assistance.
- Perform regular reviews of implementations done by less experienced developers, offering feedback and suggestions on their code.
- Mentor the junior and mid-level developers of the team, and delegate tasks to team members in a balanced and effective manner.
- Set up a development environment independently, and mentor a team of junior developers.
- Independently communicate with both client technical teams and business owners as needed during design and implementation.

Knowledge and Skill:
- 3+ years of experience working on Salesforce platforms.
- At least the Salesforce certification "Salesforce Platform Developer I".
- Direct experience working on CRM projects for middle-market and enterprise-size companies.
- Working knowledge and experience with complex business systems integration as well as object-oriented design patterns and development.
- Software engineering skills with the Force.com platform (Apex, LWC, SOQL, unit testing).
- Experience in core web technologies including HTML5, JavaScript, and jQuery.
- Demonstrated experience and knowledge of relational databases, data modelling, and ETL tools.
- Experience with web services (REST & SOAP, JSON & XML, etc.).
- Experience with Agile development methodologies such as SCRUM.
- Excellent organizational, verbal, and written communication skills.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 5 days ago