
1052 ETL Processes Jobs - Page 21

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

The opportunity
As a Data Migration Specialist, you will play a crucial role in developing and executing comprehensive data migration strategies. Your responsibilities will include analyzing legacy systems, designing and implementing ETL processes using SAP BusinessObjects Data Services (BODS), and optimizing BODS jobs for performance and reliability. You will provide functional expertise in SAP SD, MM, and PP modules to ensure accurate data alignment and drive data quality improvement initiatives. Collaboration with cross-functional teams and stakeholders will be essential for clear communication and documentation.

Your background
To excel in this role, you should hold a Bachelor's degree in Computer Science, IT, or a related field. With 8+ years of experience in SAP data migration, including SAP S/4HANA (preferred), you should be proficient in SAP BusinessObjects Data Services (BODS) and data migration tools. Strong functional knowledge of SAP SD, MM, and PP modules is required, along with skills in data analysis, cleansing, and transformation techniques. You will conduct data validation, reconciliation, and testing to ensure data integrity, so strong problem-solving, analytical, and communication skills are essential. The ability to work independently and collaboratively in team environments is crucial for success in this role. SAP certification and project management experience would be an added advantage.

Qualified individuals with a disability may request a reasonable accommodation if they are unable or limited in their ability to use or access the Hitachi Energy career site as a result of their disability. To request an accommodation, complete a general inquiry form on the website, including your contact information and specific details about the accommodation needed to support you during the job application process. This accommodation assistance is exclusively for job seekers with disabilities needing accessibility support during the application process; other inquiries will not be responded to.

Posted 1 month ago

Apply

0.0 - 3.0 years

0 Lacs

Pune, Maharashtra

On-site

We are looking for a highly skilled Technical Data Analyst to join our team. You should have a strong technical background in Oracle PL/SQL and Python, along with expertise in data analysis tools and techniques. As the ideal candidate, you will be a strategic thinker capable of leading and mentoring a team of data analysts. Your role will involve driving data-driven insights and contributing to key business decisions, as well as researching and evaluating emerging AI tools and techniques for potential application in data analysis projects.

Your responsibilities will include designing, developing, and maintaining complex Oracle PL/SQL queries and procedures for data extraction, transformation, and loading processes. You will also use Python scripting for data analysis, automation, and reporting. Performing in-depth data analysis to identify trends, patterns, and anomalies will be crucial in providing actionable insights to enhance business performance. Collaboration with cross-functional teams to understand business requirements and translate them into technical specifications is essential. You will also develop and maintain data quality standards, ensure data integrity across various systems, and use data analysis and visualization tools to create interactive dashboards and reports for business stakeholders.

Preferred qualifications include hands-on experience as a Technical Data Analyst with expertise in Oracle PL/SQL and Python programming; proficiency in Python scripting; expertise in data visualization tools such as Tableau, Power BI, or Qlik Sense; and an understanding of AI/ML tools and techniques in data analytics. Practical experience applying AI/ML techniques in data analysis projects is considered a plus. Strong analytical, problem-solving, communication, and interpersonal skills are essential, along with experience in the financial services industry.

Qualifications for this position include 0-2 years of relevant experience, proficiency in programming/debugging used in business applications, working knowledge of industry practice and standards, comprehensive knowledge of a specific business area for application development, and working knowledge of program languages. Consistently demonstrating clear and concise written and verbal communication is also important. A Bachelor's degree or equivalent experience is required. The mandatory skills are Ab Initio, Oracle PL/SQL, and Unix/Linux, with a minimum of 2 years of hands-on development experience.

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, please review Accessibility at Citi. Additionally, you can view Citi's EEO Policy Statement and the Know Your Rights poster.
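The anomaly-detection work this listing describes is often prototyped in a few lines of Python before being folded into PL/SQL procedures or reports. A minimal sketch using only the standard library; the data, function name, and z-score threshold are invented for illustration:

```python
import statistics

def find_anomalies(values, z_threshold=2.5):
    """Flag values whose z-score against the sample exceeds the threshold.

    Note: a single extreme outlier inflates the sample standard deviation,
    which is why the threshold here is kept modest.
    """
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > z_threshold]

# Daily transaction counts with one obvious spike (illustrative data)
daily_counts = [120, 118, 125, 131, 119, 122, 980, 127, 121, 124]
print(find_anomalies(daily_counts))  # → [980]
```

In practice an analyst would pull `values` from the database with a query rather than hard-coding them, and a robust method (median absolute deviation, for instance) would be preferred on heavier-tailed data.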

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

As an EY GDS Consulting Senior, you will contribute technically and functionally to GRC Technology client engagements and internal projects. You will identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you will anticipate and identify risks within engagements and share any issues with senior members of the team. We are seeking an experienced IT Architecture and Data Management Consultant to join our team. The ideal candidate will play a pivotal role in defining, setting up, and optimizing architecture and data pipelines specifically for risk and GRC (Governance, Risk, and Compliance) technologies. You will analyze existing IT environments and, when necessary, assist clients in developing new custom solutions to meet their needs.

Your key responsibilities include:
- Defining and implementing IT architecture and data pipelines tailored for risk and GRC technologies.
- Analyzing clients' existing IT environments to identify areas for improvement and optimization.
- Designing and implementing ETL processes and workflows.
- Understanding data lake concepts, structures, and best practices.
- Evaluating and selecting appropriate technologies and tools for data ingestion and normalization.
- Collaborating with clients to understand their requirements and provide customized solutions.
- Implementing automation for data ingestion processes to reduce manual effort and minimize errors.
- Designing systems and processes that can scale with the growing data needs of the organization.
- Developing and maintaining documentation for architecture and data management processes.
- Ensuring data integrity, security, and compliance with industry standards and best practices.
- Identifying and integrating diverse data sources, ensuring compatibility and consistency.
- Providing technical guidance and support to clients and internal teams.
- Staying updated with the latest trends and advancements in IT architecture, data engineering, and GRC technologies.
- Designing solutions that integrate AI/Gen AI/microservices for document, data, and access management, third-party integrations, and cloud environment management and monitoring.
- Understanding solution architecture design patterns and presenting solution architectures to client CIOs/CTOs.
- Driving customer requirements show-back sessions, system demos, and other workshops.
- Working as a team member to contribute to various phases of projects.
- Assisting EY regional teams on RFP pursuits and proposals for clients seeking GRC/IRM support in areas such as IT architecture, data ingestion, and solution design.
- Developing and maintaining productive working relationships with client personnel.
- Demonstrating flexibility to travel to customer locations on a need basis.
- Ensuring on-time delivery of allocated tasks.
- Ensuring adherence to quality processes specified for the project.
- Complying with EY policies and procedures such as timesheet and leave management.
- Assisting the Project Lead in the successful execution of the project (estimation, reviews, customer satisfaction, etc.).

Skills and attributes for success:
- Strong communication, presentation, and team-building skills, and experience in producing high-quality reports, papers, and presentations.
- Fostering teamwork and a quality culture, and leading by example.
- Understanding and following workplace policies and procedures.
- Training and mentoring project resources.
- Participating in organization-wide people initiatives.

To qualify for the role, you must have:
- 3-7 years of experience in GRC technology and solutions.
- A strong understanding of IT architecture principles and data engineering best practices.
- Proven experience in setting up and managing data pipelines.
- Excellent analytical and problem-solving skills.
- The ability to communicate complex technical concepts to non-technical stakeholders.
- Strong project management skills and the ability to work independently and as part of a team.
- A basic understanding of cross-GRC domains, including information security, business continuity, and risk management.
- Team building: knowledge sharing, training, motivating, and development of team members.

Ideally, you should also have:
- A Bachelor's degree in Information Technology, Computer Science, or a related field, with a minimum of 3+ years of experience with other Big 3 or panelled SI/ITeS companies.
- Familiarity with a typical IT systems development life cycle.
- Experience with specific GRC tools and platforms.
- Knowledge of industry standards and regulations related to risk and compliance.
- Certification in relevant IT architecture or data management areas.
- Exposure to multiple GRC tools such as Archer, ServiceNow, MetricStream, Enablon, etc. would be an added advantage.

What we look for:
- People with commercial acumen, technical experience, consulting skills, and enthusiasm to learn new things in this fast-moving environment.
- An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide.
- Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What working at EY offers: At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

2.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About G2 - The Company
When you join G2, you're joining the team that helps businesses reach their peak potential by powering decisions and strategies with trusted insights from real software users. G2 is the world's largest and most trusted software marketplace. More than 100 million people annually, including employees at all Fortune 500 companies, use G2 to make smarter software decisions based on authentic peer reviews. Thousands of software and services companies of all sizes partner with G2 to build their reputation and grow their business, including Salesforce, HubSpot, Zoom, and Adobe. To learn more about where you go for software, visit www.g2.com and follow us on LinkedIn. As we continue on our growth journey, we are striving to be the most trusted data source in the age of AI for informing software buying decisions and go-to-market strategies. Does that sound exciting to you? Come join us as we try to reach our next PEAK!

About G2 - Our People
At G2, we have big goals, but we stay grounded in our PEAK (Performance + Entrepreneurship + Authenticity + Kindness) values. You'll be part of a value-driven, growing global community that climbs PEAKs together. We cheer for each other's successes, learn from our mistakes, and support and lean on one another during challenging times. With ambition and entrepreneurial spirit, we push each other to take on challenging work, which will help us all to grow and learn. You will be part of a global, diverse team of smart, dedicated, and kind individuals, each with unique talents, aspirations, and life experiences. At the heart of our community and culture are our people-led ERGs, which celebrate and highlight the diverse identities of our global team. As an organization, we are intentional about our DEI and philanthropic work (like our G2 Gives program) because it encourages us all to be better people.
About The Role
G2 is looking for a Data Engineer I. You'll be actively involved in designing and implementing data pipelines, focusing on ETL processes, and contributing to the optimization of data solutions within AWS and Snowflake environments. You'll work on data extraction, transformation, and loading, ensuring reliability and efficiency in data handling for the various use cases in our G2 Data Platform.

In This Role, You Will
Data Pipeline/ELT:
- Design and develop data pipelines for the extraction, transformation, and loading of data from various sources into the G2 Data Platform.
- Collaborate and actively take part in optimizing existing data pipelines for performance, scalability, and reliability, contributing to the enhancement of data workflows.
- Participate in the full development cycle for ETL: design, implementation, validation, documentation, and maintenance.
- Contribute to documenting processes, best practices, and solutions for future reference.
- Service data requests from various users of the G2 data platform.
- Demonstrate excellent coding and debugging skills.

Business and Data Understanding, and Impact:
- Design and deliver data models/schemas.
- Take part in architectural planning and implementation of the data platform, aligning solutions with organizational goals and industry best practices.
- As needed, assist other teams with reporting, debugging data accuracy issues, and other related functions.
- Report on data quality and recommend ways to increase data cleanliness and improve our cleansing activities.

Mentorship and Collaboration:
- Engage in continuous learning, staying updated on emerging trends and techniques in data engineering.
- Collaborate with peers, actively participating in knowledge-sharing sessions and contributing to a collaborative team environment.
- Seek guidance and mentorship from senior team members to enhance technical and analytical skills.

Minimum Qualifications
We realize applying for jobs can feel daunting at times. Even if you don't check all the boxes in the job description, we encourage you to apply anyway.
- 2+ years of experience as a data engineer or ETL developer.
- 1+ years of development experience with sound skills in data modeling, database programming, and data architecture.
- Experience in the design and development of data pipelines using cloud and open-source tools.
- 2+ years of experience in Python and SQL programming.
- Good knowledge of optimization and debugging of data pipelines.
- Basic knowledge of AWS services and cloud databases.
- Proficiency in handling structured and unstructured data.

What Can Help Your Application Stand Out
- Experience with EDW and OLAP solutions.
- Working knowledge of Snowflake.
- Experience with big data frameworks such as Spark or Trino.

Our Commitment to Inclusivity and Diversity
At G2, we are committed to creating an inclusive and diverse environment where people of every background can thrive and feel welcome. We consider applicants without regard to race, color, creed, religion, national origin, genetic information, gender identity or expression, sexual orientation, pregnancy, age, or marital, veteran, or physical or mental disability status. Learn more about our commitments here.

For job applicants in California, the United Kingdom, and the European Union, please review this applicant privacy notice before applying to this job.

How We Use AI Technology In Our Hiring Process
G2 incorporates AI-powered technology to enhance our candidate evaluation process. These tools may assist with initial application screening, skills assessment analysis, and identifying candidates whose qualifications align with specific role requirements. While AI technology supports our recruitment workflow, all final hiring decisions remain under human oversight and judgment.
Your Choice Matters: If you would prefer that your application be reviewed without AI assistance, you can opt out by entering your email address in the email entry field at the bottom of the Automated Processing Legal Notice. Choosing to opt out will not disadvantage your application in any way; we will ensure your materials receive a thorough manual review by our hiring team. For additional details about how we handle your information throughout the application process, please review G2's Applicant Privacy Notice.
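The extract-transform-load cycle this role centers on can be sketched end to end with nothing but the Python standard library. This is an illustrative stand-in only: the schema, source data, and table names are invented, and a production pipeline for a role like this would target Snowflake or AWS services rather than in-memory SQLite:

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV (stands in for an external source system)
raw = "user_id,plan,monthly_fee\n1,pro,49\n2,free,0\n3,pro,49\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: keep paying customers only, normalize types
paying = [(int(r["user_id"]), r["plan"], float(r["monthly_fee"]))
          for r in rows if float(r["monthly_fee"]) > 0]

# Load: write into a warehouse-style table (SQLite as a local stand-in)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE subscriptions (user_id INTEGER, plan TEXT, fee REAL)")
conn.executemany("INSERT INTO subscriptions VALUES (?, ?, ?)", paying)

# Validate the load before downstream consumers read it
count, revenue = conn.execute(
    "SELECT COUNT(*), SUM(fee) FROM subscriptions").fetchone()
print(count, revenue)  # → 2 98.0
```

The same three stages map onto what the listing calls the "full development cycle for ETL"; in a real pipeline each stage would also carry error handling, logging, and idempotent re-run behavior.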

Posted 1 month ago

Apply

3.0 - 6.0 years

4 - 9 Lacs

Mumbai, Maharashtra, India

On-site

Responsibilities:
- Design, develop, and maintain ETL jobs using SAP Data Services (BODS) to extract, transform, and load data from SAP and non-SAP systems.
- Perform data profiling, cleansing, and validation to ensure data accuracy and consistency.
- Support data migration and data integration projects involving SAP ECC, S/4HANA, and cloud-based targets (e.g., AWS Redshift, S3).
- Collaborate with functional teams and business stakeholders to gather data requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize BODS job performance, ensuring timely and accurate data loads.
- Participate in unit testing, integration testing, and UAT support for ETL solutions.
- Create and maintain comprehensive documentation of ETL processes, data flows, and error-handling routines.
- Work closely with senior consultants and architects on design reviews and best-practices implementation.

Skills & Qualifications:
- 3-6 years of hands-on experience in SAP Data Services (BODS) development and job management.
- Solid understanding of ETL processes, data warehousing concepts, and data quality management.
- Strong SQL skills with experience in database tuning (Oracle, SQL Server, or HANA).
- Experience in extracting data from SAP modules (ECC/S/4HANA) using IDocs, RFCs, or extractors.
- Basic understanding of or exposure to SAP BW or BW/4HANA data models, InfoProviders, and data flows (good to have).
- Familiarity with job scheduling tools (Control-M, AutoSys) is desirable.
- Exposure to cloud environments (AWS S3, Redshift, Azure) is an advantage.
- Strong analytical, problem-solving, and communication skills.
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.

Business Skills
- Excellent oral and written communication skills; the ability to communicate with others clearly and concisely.
- Experience with the Microsoft Office suite, including Word, Excel, and PowerPoint.
- Understands business processes for focus areas or modules.
- Ability to do research and perform detailed tasks.
- Strong analytical skills.

Consulting Skills
- Aptitude for working in a team environment; problem-solving skills, creative thinking, communicating clearly and empathetically, strong time management, and the ability to collaborate with all levels of staff.
- Learn and understand the consulting soft skills necessary on engagements, as well as with team collaborative initiatives.
- Strong presentation skills.

General Skills/Tasks
- Understands the client's business and technical environment.
- Assists the project team's efforts in documenting and developing solutions for client situations.
- Assists the team's effort in preparing and developing solution documentation for projects.
- Completes assignments within budget, meets project deadlines, and makes and keeps sensible commitments to the team.
- Meets billing efficiency targets and complies with all administrative responsibilities in a timely and effective manner.
- Learns to understand and adhere to project and organization guidelines, handling all administrative responsibilities in a timely and effective manner.
- Keeps the manager apprised of workload direction and concerns.
- Learns to analyse and develop reliable solutions that produce efficient and effective outcomes.
- Develops a deeper understanding of SAP methodologies, tools, standards, and techniques.
- Assists with project documentation and demonstrates effective organizational skills, with minimal supervision.
- Adopts learning quality standards and correctly prioritizes own activities following the project plan.
- Provides the project team and leaders with updates on progress and difficulties encountered, and provides value-added insight and understanding for future program development.
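Data profiling of the kind listed above — null counts, distinct counts, and value frequencies checked ahead of a load — is easy to prototype outside the ETL tool. A minimal sketch, assuming invented material-master-style records; this stands in for, and is not, the profiling features of SAP Data Services itself:

```python
from collections import Counter

# Illustrative records resembling a material-master extract
records = [
    {"material": "M-100", "plant": "IN01", "uom": "EA"},
    {"material": "M-101", "plant": "IN01", "uom": None},
    {"material": "M-100", "plant": "IN02", "uom": "EA"},
    {"material": None,    "plant": "IN01", "uom": "KG"},
]

def profile(records, field):
    """Return (null_count, distinct_non_null_count, value_frequencies)."""
    values = [r[field] for r in records]
    nulls = sum(1 for v in values if v is None)
    non_null = [v for v in values if v is not None]
    return nulls, len(set(non_null)), Counter(non_null)

nulls, distinct, freq = profile(records, "material")
print(nulls, distinct, freq.most_common(1))  # → 1 2 [('M-100', 2)]
```

Profiles like this one tell the migration team which fields need cleansing rules (the null `material` row, for example) before validation and load.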

Posted 1 month ago

Apply

4.0 - 12.0 years

0 - 14 Lacs

Hyderabad, Telangana, India

On-site

Responsibilities
- Designing and implementing SAP HANA solutions based on business requirements.
- Optimizing existing HANA models for performance and efficiency.
- Collaborating with cross-functional teams to gather requirements and deliver solutions.
- Providing technical support and troubleshooting for SAP HANA applications.
- Developing and maintaining documentation for SAP HANA architecture and processes.
- Conducting performance tuning and optimization of HANA databases.

Skills and Qualifications
- 4-12 years of experience in SAP HANA development and implementation.
- Strong understanding of SAP HANA architecture and data modeling concepts.
- Experience with SQL and SQLScript for data manipulation and reporting.
- Knowledge of ETL processes and tools related to SAP HANA.
- Familiarity with SAP BW and SAP BusinessObjects is a plus.
- Strong problem-solving skills and ability to work under pressure.
- Excellent communication and interpersonal skills.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You are an experienced Senior Data Analyst with a minimum of 7-8 years of experience in data analysis roles, specifically with significant exposure to Snowflake. Your primary responsibilities will include querying and analyzing data stored in Snowflake databases to derive meaningful insights for supporting business decision-making. You will also be responsible for developing and maintaining data models and schema designs within Snowflake to facilitate efficient data analysis. In addition, you will create and maintain data visualizations and dashboards using tools like Tableau or Power BI, leveraging Snowflake as the underlying data source. Collaboration with business stakeholders to understand data requirements and translate them into analytical solutions is a key aspect of this role. You will also perform data validation, quality assurance, and data cleansing activities within Snowflake databases. Furthermore, you will support the implementation and enhancement of ETL processes and data pipelines to ensure data accuracy and completeness. A Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Information Systems, or a related field is required. Certifications in data analytics, data visualization, or cloud platforms are desirable but not mandatory. Your primary skills should encompass a strong proficiency in querying and analyzing data using Snowflake SQL and DBT. You must have a solid understanding of data modeling and schema design within Snowflake environments. Experience in data visualization and reporting tools such as Power BI, Tableau, or Looker is essential for analyzing and presenting insights derived from Snowflake. Familiarity with ETL processes and data pipeline development is also crucial, along with a proven track record of using Snowflake for complex data analysis and reporting tasks. Strong problem-solving and analytical skills, including the ability to derive actionable insights from data, are key requirements. 
Experience with programming languages like Python or R for data manipulation and analysis is a plus. Secondary skills that would be beneficial for this role include knowledge of cloud platforms and services such as AWS, Azure, or GCP. Excellent communication and presentation skills, strong attention to detail, and a proactive approach to problem-solving are also important. The ability to work collaboratively in a team environment is essential for success in this position. This role is for a Senior Data Analyst specializing in Snowflake, based in either Trivandrum or Bangalore. The working hours are 8 hours per day from 12:00 PM to 9:00 PM, with a few hours of overlap with the EST time zone for mandatory meetings. The close date for applications is 18-04-2025.
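The validation and reconciliation work described in this listing usually reduces to comparing aggregates between a source and a target table. This sketch uses SQLite in place of Snowflake, with invented table names and data, but the SQL pattern of joining per-group totals carries over directly to Snowflake SQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (region TEXT, amount REAL);
    CREATE TABLE tgt_orders (region TEXT, amount REAL);
    INSERT INTO src_orders VALUES ('EU', 100), ('EU', 50), ('US', 75);
    INSERT INTO tgt_orders VALUES ('EU', 100), ('EU', 50);  -- US row lost in load
""")

# Reconcile per-region totals; any row returned indicates a load problem
mismatches = conn.execute("""
    SELECT s.region, s.total AS src_total, COALESCE(t.total, 0) AS tgt_total
    FROM (SELECT region, SUM(amount) AS total FROM src_orders GROUP BY region) s
    LEFT JOIN (SELECT region, SUM(amount) AS total FROM tgt_orders GROUP BY region) t
      ON s.region = t.region
    WHERE COALESCE(t.total, 0) != s.total
""").fetchall()
print(mismatches)  # → [('US', 75.0, 0)]
```

An empty result is the pass condition; in a scheduled pipeline, a non-empty result would typically raise an alert rather than just print.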

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer at Bridgnext, you will be responsible for working on internal and customer-based projects. Your primary focus will be on ensuring the quality of the code and providing optimal solutions to meet client requirements while anticipating their future needs based on market understanding. Your experience with Hadoop projects, including data processing and representation using various AWS services, will be valuable in this role.

You should have at least 4 years of experience in data engineering, with a specialization in big data technologies such as Spark and Kafka. A minimum of 2 years of hands-on experience with Databricks is essential for this position. A strong understanding of data architecture, ETL processes, and data warehousing is necessary, along with proficiency in programming languages like Python or Java. Experience with cloud platforms such as AWS, Azure, and GCP, as well as familiarity with big data tools, will be beneficial. Excellent communication, interpersonal, and leadership skills are required to effectively collaborate with team members and clients. You should be able to work in a fast-paced environment, managing multiple priorities efficiently.

In addition to technical skills, you should possess solid written, verbal, and presentation communication abilities. Being a strong team player while also capable of working independently is crucial. Maintaining composure in various situations, a collaborative nature, high standards of professionalism, and consistently delivering high-quality results are expected from you. Your self-sufficiency and openness to creative solutions will be key in addressing any challenges that may arise in the role.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You are a strategic thinker passionate about driving solutions in BI and Analytics (Alteryx, SQL, Tableau), and you have found the right team. As a Business Intelligence Developer Associate within our Asset and Wealth Management Finance Transformation and Analytics team, you will be tasked with defining, refining, and achieving set objectives for our firm on a daily basis. You will be responsible for designing the technical and information architecture for the MIS (DataMarts) and Reporting Environments. Additionally, you will support the MIS team in query optimization and deployment of BI technologies, including but not limited to Alteryx, Tableau, MS SQL Server (T-SQL programming), SSIS, and SSRS. You will scope, prioritize, and coordinate activities with the product owners, design and develop complex queries for data inputs, and work on agile improvements by sharing experiences and knowledge with the team. Furthermore, you will advocate and steer the team to implement CI/CD (DevOps) workflow and design and develop complex dashboards from large and/or different data sets. The ideal candidate for this position will be highly skilled in reporting methodologies, data manipulation & analytics tools, and have expertise in the visualization and presentation of enterprise data. Required qualifications, capabilities, and skills include a Bachelor's Degree in MIS, Computer Science, or Engineering. A different field of study with significant professional experience in BI Development is also acceptable. Strong DW-BI skills are required with a minimum of 7 years of experience in Data warehouse and visualization. You should have strong work experience in data wrangling tools like Alteryx and working proficiency in Data Visualizations Tools, including but not limited to Alteryx, Tableau, MS SQL Server (SSIS, SSRS). Working knowledge in querying data from databases such as MS SQL Server, Snowflake, Databricks, etc., is essential. 
You must have a strong knowledge of designing database architecture, building scalable visualization solutions, and the ability to write complicated yet efficient SQL queries and stored procedures. Experience in building end-to-end ETL processes, working with multiple data sources, handling large volumes of data, and converting data into information is required. Experience in the end-to-end implementation of Business Intelligence (BI) reports and dashboards, as well as good communication and analytical skills, are also necessary. Preferred qualifications, capabilities, and skills include exposure to Data Science and allied technologies like Python, R, etc. Exposure to automation tools like UIPath, Blue Prism, Power Automate, etc., working knowledge of CI/CD workflows and automated deployment, and experience with scheduling tools like Control M are considered advantageous.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Business Intelligence Specialist at Adobe, you will have the opportunity to work closely with Business analysts to understand design specifications and translate requirements into technical models, dashboards, reports, and applications. Your role will involve collaborating with business users to cater to their ad-hoc requests and deliver scalable solutions on MSBI platforms. You will be responsible for system integration of data sources, creating technical documents, and ensuring data and code quality through standard methodologies and processes. To succeed in this role, you should have at least 3 years of experience in SSIS, SSAS, Data Warehousing, Data Analysis, and Business Intelligence. You should also possess advanced proficiency in Data Warehousing tools and technologies, including databases, SSIS, and SSAS, along with in-depth understanding of Data Warehousing principles and Dimensional Modeling techniques. Hands-on experience in ETL processes, database optimization, and query tuning is essential. Familiarity with cloud platforms such as Azure and AWS, as well as Python or PySpark and Databricks, would be beneficial. Experience in creating interactive dashboards using Power BI is an added advantage. In addition to technical skills, strong problem-solving and analytical abilities, quick learning capabilities, and excellent communication and presentation skills are important for this role. A Bachelor's degree in Computer Science, Information Technology, or an equivalent technical discipline is required. At Adobe, we value a free and open marketplace for all employees and provide internal growth opportunities for your career development. We encourage creativity, curiosity, and continuous learning as part of your career journey. To prepare for internal opportunities, update your Resume/CV and Workday profile, explore the Internal Mobility page on Inside Adobe, and check out tips to help you prep for interviews. 
The Talent Team will reach out within two weeks of your application via Workday; if you move forward in the interview process, inform your manager so they can support your career growth. Join Adobe to work in an exceptional environment with colleagues committed to helping each other grow through ongoing feedback. If you are looking to make an impact and grow your career, Adobe is the place for you. Discover more about employee experiences on the Adobe Life blog and explore the meaningful benefits we offer. For any accommodation needs during the application process, please contact accommodations@adobe.com.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

Be a part of a dynamic team and excel in an environment that values diversity and creativity. Continue to sharpen your skills and ambition while pushing the industry forward. As a Data Architect at JPMorgan Chase within the Employee Platforms, you serve as a seasoned member of a team to develop high-quality data architecture solutions for various software applications and platforms. By incorporating leading best practices and collaborating with teams of architects, you are an integral part of carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives. In this role, you will be responsible for designing and implementing data models that support our organization's data strategy. You will work closely with Data Product Managers, Engineering teams, and Data Governance teams to ensure the delivery of high-quality data products that meet business needs and adhere to best practices.

Job responsibilities include:
- Executing data architecture solutions and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions and break down problems.
- Collaborating with Data Product Managers to understand business requirements and translate them into data modeling specifications. Conducting interviews and workshops with stakeholders to gather detailed data requirements.
- Creating and maintaining data dictionaries, entity-relationship diagrams, and other documentation to support data models.
- Producing secure and high-quality production code and maintaining algorithms that run synchronously with appropriate systems.
- Evaluating data architecture designs and providing feedback on recommendations.
- Representing the team in architectural governance bodies.
- Leading the data architecture team in evaluating new technologies to modernize the architecture using existing data standards and frameworks.
- Gathering, analyzing, synthesizing, and developing visualizations and reporting from large, diverse data sets in service of continuous improvement of data frameworks, applications, and systems.
- Proactively identifying hidden problems and patterns in data and using these insights to drive improvements to coding hygiene and system architecture.
- Contributing to data architecture communities of practice and events that explore new and emerging technologies.

Required qualifications, capabilities, and skills:
- Formal training or certification in Data Architecture and 3+ years of applied experience.
- Hands-on experience in data platforms, cloud services (e.g., AWS, Azure, or Google Cloud), and big data technologies.
- Strong understanding of database management systems, data warehousing, and ETL processes.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- Knowledge of data governance principles and best practices.
- Ability to evaluate current technologies to recommend ways to optimize data architecture.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming and database querying languages.
- Overall knowledge of the Software Development Life Cycle.
- Solid understanding of agile methodologies such as continuous integration and delivery, application resiliency, and security.
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.).

Preferred qualifications, capabilities, and skills:
- Experience with cloud-based data platforms (e.g., AWS, Azure, Google Cloud).
- Familiarity with big data technologies (e.g., Hadoop, Spark).
- Certification in data modeling or data architecture.
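As a rough illustration of one documentation artifact mentioned above, the sketch below derives a simple data dictionary from a relational schema. It uses SQLite from the Python standard library, and the table and column names are invented for this example:

```python
import sqlite3

# Hypothetical schema, invented for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE employee (
        emp_id      INTEGER PRIMARY KEY,   -- surrogate key
        full_name   TEXT NOT NULL,
        dept_code   TEXT,                  -- would reference a department table
        hired_on    DATE
    )
""")

def data_dictionary(conn, table):
    """Return (column, type, nullable, is_pk) rows for the given table."""
    rows = conn.execute(f"PRAGMA table_info({table})").fetchall()
    # PRAGMA table_info yields (cid, name, type, notnull, dflt_value, pk)
    return [(name, ctype, not notnull, bool(pk))
            for _, name, ctype, notnull, _, pk in rows]

for entry in data_dictionary(conn, "employee"):
    print(entry)
```

In practice the same metadata would come from the warehouse's information schema and be published alongside the entity-relationship diagrams the role maintains.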

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

You will be responsible for designing, developing, and optimizing interactive dashboards using Looker and LookML. This includes building LookML models, explores, and derived tables to meet business intelligence needs. You will create efficient data models and queries using BigQuery and collaborate with data engineers, analysts, and business teams to translate requirements into actionable insights. Implementing security and governance policies within Looker to ensure data integrity and controlled access will also be part of your role. Additionally, you will leverage GCP services to build scalable and reliable data solutions and optimize dashboard performance using best practices in aggregation and visualization. Maintaining, auditing, and enhancing existing Looker dashboards, reports, and LookML assets, as well as documenting dashboards, data sources, and processes for scalability and ease of maintenance, are critical tasks. You will also support legacy implementations and facilitate smooth transitions, build new dashboards and visualizations based on evolving business requirements, and work closely with data engineering teams to define and validate data pipelines for timely and accurate data delivery.

To qualify for this role, you should have at least 6 years of experience in data visualization and BI, particularly using Looker and LookML. Strong SQL skills with experience optimizing queries for BigQuery are required, along with proficiency in Google Cloud Platform (GCP) and related data services. An in-depth understanding of data modeling, ETL processes, and database structures is essential, as is familiarity with data governance, security, and role-based access in Looker. Experience with BI lifecycle management, strong communication and collaboration skills, good storytelling and user-centric design abilities, and exposure to handling large datasets in the media industry (OTT, DTH, Web) are also necessary.
Knowledge of other BI tools like Tableau, Power BI, or Data Studio is a plus, and experience with Python or other scripting languages for automation and data transformation is desirable. Exposure to machine learning or predictive analytics is considered an advantage.
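The performance practice mentioned above, optimizing dashboards through aggregation, can be sketched as pre-computing a rollup table in the warehouse so that dashboards query a small summary instead of raw events. SQLite stands in for BigQuery here, and the events table and its columns are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Invented raw viewing events (media-industry flavour)
    CREATE TABLE play_events (user_id INT, title TEXT, watched_min REAL, day TEXT);
    INSERT INTO play_events VALUES
        (1, 'News', 12.5, '2024-01-01'),
        (2, 'News', 30.0, '2024-01-01'),
        (1, 'Film', 95.0, '2024-01-02');
    -- Pre-aggregated rollup the dashboard would read instead of raw events
    CREATE TABLE daily_title_rollup AS
        SELECT day, title,
               COUNT(DISTINCT user_id) AS viewers,
               SUM(watched_min)        AS total_min
        FROM play_events
        GROUP BY day, title;
""")
for row in conn.execute("SELECT * FROM daily_title_rollup ORDER BY day, title"):
    print(row)
```

At production scale, a Looker persistent derived table or a BigQuery materialized view would play the same role as the rollup table in this sketch.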

Posted 1 month ago

Apply

9.0 - 13.0 years

0 Lacs

chennai, tamil nadu

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

As a T24 BA_Data Migration - Senior Manager, you will lead the end-to-end analysis and execution of data migration initiatives across complex enterprise systems. This role demands deep expertise in data migration strategies, strong analytical capabilities, and a proven ability to work with cross-functional teams, including IT, business stakeholders, and data architects. You will be responsible for defining migration requirements, leading data mapping and reconciliation efforts, ensuring data integrity, and supporting transformation programs from legacy systems to modern platforms. As a senior leader, you will also play a critical role in stakeholder engagement, risk mitigation, and aligning data migration efforts with broader business objectives.

The ideal candidate should be well versed in the technical aspects of the product and experienced in data migration activities. They should have a good understanding of the T24 architecture, administration, configuration, and data structure. Additionally, the candidate should have design and development experience in Infobasic, Core Java, EJB, and J2EE Enterprise, as well as working experience with and/or knowledge of Informatica. In-depth experience in end-to-end migration tasks, from migration strategy through the ETL process and data reconciliation, is required. Experience in relational or hierarchical databases, including Oracle, DB2, Postgres, MySQL, and MSSQL, is a must. Candidates must also be willing to work out of the client location in Chennai five days a week.

The candidate should possess an MBA/MCA/BE/B.Tech or equivalent with sound industry experience of 9 to 12 years.

Your client responsibilities will involve working as a team lead on one or more T24 projects, interfacing and communicating with onsite coordinators, completing assigned tasks on time, reporting status regularly to the lead and manager, and interfacing with customer representatives as needed. You should be ready to travel to customer locations on a need basis.

Your people responsibilities will include building a quality culture, managing performance for direct reports, fostering teamwork, leading by example, training and mentoring project resources, and participating in organization-wide people initiatives. Preferred skills include database administration, performance tuning, and prior client-facing experience.

EY exists to build a better working world, helping to create long-term value for clients, people, and society, and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

haryana

On-site

As a PowerBI Developer, you will be responsible for developing and maintaining scalable data pipelines using Python and PySpark. You will collaborate with data engineers and data scientists to fulfill data processing needs and optimize existing PySpark applications for performance improvements. Writing clean, efficient, and well-documented code following best practices is a crucial part of your role. Additionally, you will participate in design and code reviews, develop and implement ETL processes, and ensure data integrity and quality throughout the data lifecycle. Staying current with the latest industry trends and technologies in big data and cloud computing is essential.

The ideal candidate should have a minimum of 6 years of experience in designing and developing advanced Power BI reports and dashboards, with working experience in data modeling, DAX calculations, report and dashboard creation, data analysis and visualization, data governance and compliance, and troubleshooting and optimizing Power BI solutions.

Preferred skills for this role include strong proficiency in Power BI Desktop, DAX, Power Query, and data modeling. Experience in analyzing data, creating visualizations, building interactive dashboards, connecting to various data sources, and transforming data is highly valued. Excellent communication and collaboration skills are necessary to work effectively with stakeholders. Familiarity with SQL, data warehousing concepts, and experience with UI/UX development would be beneficial.
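As a hedged sketch of the extract-transform-load pattern this role centres on, the example below uses only the Python standard library (CSV in, SQLite out); at the scale described in the listing the same steps would be expressed with PySpark DataFrames. All file contents and table names are invented:

```python
import csv
import io
import sqlite3

# Invented raw input, standing in for a source file or feed.
raw = io.StringIO("order_id,amount,region\n1, 100.5 ,north\n2,,south\n3,75.0,north\n")

# Extract: read the raw records
rows = list(csv.DictReader(raw))

# Transform: trim whitespace, drop rows missing an amount, cast types
clean = [
    {"order_id": int(r["order_id"]),
     "amount": float(r["amount"].strip()),
     "region": r["region"].strip().lower()}
    for r in rows if r["amount"].strip()
]

# Load: write the cleaned rows into the target store
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INT, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (:order_id, :amount, :region)", clean)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 175.5 (the empty-amount row was dropped)
```

The same validate-then-load shape is what "ensuring data integrity and quality throughout the data lifecycle" looks like in practice, whatever the engine.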

Posted 1 month ago

Apply

7.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

The Data Science and Engineering Specialist role requires 7-12 years of total experience and is based in Chennai on a permanent hire basis. As a Data Science and Engineering Specialist, you will utilize data science techniques and engineering principles to develop and optimize data-driven solutions. Your role will involve combining analytical skills with technical expertise to create innovative models and systems that drive business insights and efficiencies. Your main responsibilities will include developing and implementing data models and algorithms, designing and optimizing data pipelines and workflows, collaborating with cross-functional teams to understand data needs, analyzing large datasets for actionable insights, creating and maintaining documentation for data processes, ensuring data quality and integrity, developing and deploying machine learning models, performing data preprocessing and feature engineering, monitoring and improving data system performance, and staying updated on the latest data science and engineering trends. To succeed in this role, you should have proficiency in data science tools and languages such as Python and R, a strong understanding of machine learning algorithms, expertise in data engineering and ETL processes, excellent problem-solving and analytical skills, the ability to work with large and complex datasets, strong communication and collaboration abilities, knowledge of cloud platforms and big data technologies, experience with data visualization tools, an understanding of software development principles, and the ability to translate business requirements into technical solutions. 
If you are interested in this opportunity, please share the following details to akshay.ganar@atos.net:
- Total IT Experience
- Experience in Data Science
- Experience in Python, R programming
- Current CTC
- Expected CTC
- Notice Period/LWD
- Offer if any
- Current Location
- Preferred Location

Once your profile is shortlisted, you will be contacted for further discussions.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

As a Global Data Steward at Axalta's facility in Gurugram, Haryana, you will play a crucial role in ensuring the smooth operation of business processes by managing master data objects such as creation, update, obsolescence, reactivation, and accurate data maintenance in the system. Your responsibilities will include collaborating with business teams to clarify requests, maintaining data quality, testing data creations/updates, and mentoring team members. You will be required to work on daily business requests within defined SLA timelines and engage in additional tasks/projects that may involve multiple team interactions. To excel in this role, you should have hands-on experience in master data creation and maintenance, particularly in areas such as Material, Vendor, Pricing, Customer, PIRs, Source List, and BOM data. Proficiency in SAP toolsets related to data management, data extraction programs, ETL processes, data quality maintenance, and cleansing is essential. Knowledge of Request Management tools like SNOW and Remedy, as well as understanding key database concepts and data models, will be beneficial. An ideal candidate for this position would possess professional experience of 5-6 years, with expertise in Data Management Processes, SAP modules (MM/PP or OTC), and IT tools. Strong communication skills, stakeholder alignment, and the ability to interact with international colleagues are crucial. Additionally, you should demonstrate a strong ownership focus, drive to excel, and the ability to resolve conflicts, collaborate, and work effectively as a team player. Flexibility to work in shifts is also required for this role. Axalta, a leading company in the coatings industry, operates in two segments - Performance Coatings and Mobility Coatings, serving various end markets across the globe. 
With a commitment to sustainability and carbon neutrality, Axalta aims to deliver innovative solutions that protect and enhance products while contributing to a more sustainable future. Join us in our mission to optimize businesses and achieve common goals across diverse geographies and industries.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

hyderabad, telangana

On-site

About Invenio: Invenio is the largest independent global SAP solutions provider serving the public sector, as well as offering specialist skills in media and entertainment. We bring deep expertise combined with advanced technologies to enable organizations to modernize so they can operate at the speed of today's business. We understand the complexities of international businesses and public sector organizations, working with stakeholders to drive change and create agile organizations of tomorrow using the technologies of today. Learn more at www.invenio-solutions.com.

Role - SAP BO BW Senior Consultant
Location - Delhi/Mumbai/Pune/Noida/Hyderabad

Responsibilities:
- Document all technical and functional specifications for implemented solutions.
- Proficient in BW/B4H & ABAP/CDS, with experience in analysis, design, and development.
- Collaborate with clients to gather business requirements and translate them into BI/BW technical solutions.
- Interact with key stakeholders/support members in different areas of BW.
- Provide technical solutions to fulfill business requests using SAP's BW.
- Design, develop, configure, migrate, test, and implement SAP BW 7.x data warehousing solutions using SAP BW, BW/4HANA, and related tools.
- Ensure data accuracy, integrity, and consistency in the SAP landscape.
- Optimize performance of queries, reports, and data models for better efficiency.
- Manage delivery of services against agreed SLAs, and manage escalations both internally and externally.
- Understand client business requirements, processes, and objectives, and develop the product adjustments needed to fulfill clients' needs.
- Develop process chains to load and monitor data loading.
- Provide technical guidance and mentorship to junior consultants and team members.
- Design and build data flows including InfoObjects, Advanced DataStore Objects (ADSO), Composite Providers, Transformations, DTPs, and Data Sources.
- Conduct requirement-gathering sessions and provide a design thinking approach.
- Work closely with clients to understand their business needs and provide tailored solutions.
- Build and maintain strong relationships with key stakeholders, ensuring satisfaction and trust.
- Manage and mentor a team of consultants, ensuring high-quality delivery and skill development.
- Facilitate knowledge sharing and promote the adoption of new tools and methodologies within the team.
- Act as an escalation point for technical and functional challenges; well experienced in handling P1 and P2 situations.

Skills & Qualifications:
- Bachelor's degree in IT or equivalent.
- 6 to 8 years of experience in one or more SAP modules.
- At least four full life cycle SAP BW implementations and at least two with BI 7.x experience (from Blueprint/Explore through Go-Live).
- Ability to use Service Marketplace to create tickets, research notes, review release notes and solution roadmaps, and provide guidance to customers on release strategy.
- Exposure to other SAP modules and integration points.
- Strong understanding of SAP BW architecture, including BW on HANA, BW/4HANA, and SAP S/4HANA integration.
- Knowledge of SAP ECC, S/4HANA, and other SAP modules.
- Proficiency in SAP BI tools such as SAP BusinessObjects, SAP Lumira, and SAP Analytics Cloud.
- Experience with data modeling, ETL processes, and SQL.
- Certifications in SAP Certified Application Associate - SAP Business Warehouse (BW) and SAP Certified Application Associate - SAP HANA.
- Well versed in acquiring data through different extraction methods.
- Flexibility to work in shifts based on project requirements.
- Strong skills in SAP BI/BW, BW/4HANA, and BW on HANA development, plus production support experience.
- Excellent communication, client management, and stakeholder engagement abilities.
- Extensive work on BW user exits, start routines, and end routines, with expertise in ABAP/4.
- Extensive work on standard data source enhancements and InfoProvider enhancements.
- In-depth knowledge and understanding of SAP BI tools such as Web Intelligence, Analysis for Office, and Query Designer.
- End-to-end experience: can independently investigate issues from Data Source/Extractor to the BI report level, with strong problem-solving skills.
- End-to-end development experience: can build extractors, model within SAP BW, and develop reporting solutions, including troubleshooting development issues.

Business Skills:
- Excellent oral and written communication skills; the ability to communicate with others clearly and concisely.
- Understands business processes for focus areas or modules.
- Ability to do research and perform detailed tasks.
- Strong analytical skills.
- Understands business functionality related to the SAP module/application and can identify and understand touchpoints between modules.
- Understands how to solve detailed SAP problems.
- Understands and can explain best business practices, especially those that SAP enables.

Consulting Skills:
- Aptitude for working in a team environment; problem-solving skills, creative thinking, clear and empathetic communication, strong time management, and the ability to collaborate with all levels of staff.
- Willingness to learn the consulting soft skills necessary on engagements, as well as in team collaborative initiatives.
- Ability to interpret requirements and apply SAP best practices.
- Strong presentation skills.

General Skills/Tasks:
- Understands clients' business and technical environments.
- Assists project team efforts in documenting and developing solutions for client situations.
- Assists team efforts in preparing and developing solution documentation for projects.
- Learns to understand and adhere to project and organization guidelines, handling all administrative responsibilities in a timely and effective manner.
- Keeps the manager apprised of workload direction and concerns.
- Learns to analyze and develop reliable solutions that produce efficient and effective outcomes.
- Develops a deeper understanding of SAP methodologies, tools, standards, and techniques.
- Assists with project documentation and demonstrates effective organizational skills, with minimal supervision.
- Provides the project team and leaders with updates on progress and difficulties encountered, and provides value-added insight and understanding for future program development.
- Demonstrates the ability to accomplish project assignments resulting in quality service.

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description

Role: Senior Data Analytics Consultant

Summary: The Senior Data Analytics Consultant is a high-impact role requiring a convergence of deep analytical expertise, advanced technological adeptness, and exceptional collaborative capabilities. This professional will define and drive data strategies that optimize business outcomes through innovative analytics and predictive modeling. Proficient in SQL, Power BI, and Alteryx, the candidate will also be familiar with data modeling, cloud data solutions, and data quality management practices. Leadership and stakeholder management skills are pivotal, ensuring that data insights translate into actionable business strategies across the organization. The ideal candidate will possess refined problem-solving skills, advanced technical capabilities, and a track record of strategic impact in data analytics.

Work Experience:
- 3+ years of total experience in data analytics, with at least 2 years of extensive hands-on experience in SQL, Alteryx, Power BI, and relational database concepts.
- Expertise in Power BI, including Power Query, data modeling, and visualization.
- In-depth knowledge of DAX for creating complex calculations, measures, and custom columns.
- Proven experience with Row Level Security (RLS) in Power BI to control data access based on user roles.
- Ability to use Alteryx to design, develop, and maintain robust data workflows that automate data extraction, transformation, and loading (ETL) processes.
- Ensures data quality and integrity by implementing data validation and cleansing techniques.
- Proficiency in the core concepts of data modeling, including understanding entities, relationships, keys, normalization, and various modeling techniques (e.g., ER diagrams, Star Schema, Snowflake Schema).
- Basic understanding of cloud technologies.
- Power Platform knowledge is an added advantage.
- A programming background is an added advantage.

Responsibilities:
- Spearhead data analytics initiatives, utilizing advanced data visualization and statistical techniques to unearth insights and streamline opportunities for growth.
- Architect and ensure precision in data design, metrics, and analytics distributed to interdisciplinary stakeholders.
- Implement normalization techniques to streamline data into efficient, non-redundant structures that reduce anomalies.
- Maintain and customize dashboards and workflows using Power BI and Alteryx.
- Integrate closely with Business Development, Operations, and other teams to build a robust understanding of processes, challenges, and customer needs.
- Define and visualize key performance indicators (KPIs) to measure and articulate project success and stakeholder engagement.
- Demonstrated ability to lead projects, influence teams, and mentor junior analysts.

Qualifications:
- Education: Bachelor's degree (BE/BTech) in Computer Science, Engineering, Data Science, or related fields.
- Highly preferable: Master's degree in Data Analytics, Data Science, or Statistics, or any relevant certification(s) such as CAP, DASCA, Microsoft Certified: Data Analyst Associate, or equivalent.
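The normalization responsibility above can be illustrated with a minimal Python sketch that splits a redundant flat record set into a customer table and an invoice table keyed on a surrogate customer id; all data and field names are invented:

```python
# Invented denormalized records: customer attributes repeat on every invoice.
flat = [
    {"customer": "Acme", "city": "Pune",   "invoice": "INV-1", "total": 120.0},
    {"customer": "Acme", "city": "Pune",   "invoice": "INV-2", "total": 80.0},
    {"customer": "Beta", "city": "Mumbai", "invoice": "INV-3", "total": 50.0},
]

# Customer dimension: one row per customer, redundancy removed
customers = {}
for rec in flat:
    customers.setdefault(rec["customer"], {"city": rec["city"]})
customer_ids = {name: i + 1 for i, name in enumerate(customers)}

# Invoice fact table referencing the customer key instead of repeating attributes
invoices = [
    {"invoice": rec["invoice"],
     "customer_id": customer_ids[rec["customer"]],
     "total": rec["total"]}
    for rec in flat
]

print(len(customers), len(invoices))  # 2 3
```

Updating Acme's city now touches one row instead of two, which is the update-anomaly reduction the listing refers to; the resulting dimension/fact split is also the starting point for a Star Schema.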

Posted 1 month ago

Apply

2.0 - 4.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

About the Company: Relay Human Cloud is a young and dynamic company that helps some of the top US-based companies to expand their teams internationally. Relay is a truly global company with operations in the US, India, Honduras, and Mexico (a few more countries are being added soon). Our core focus is to enable companies to connect with the best international talent. Relay helps its clients primarily in the following areas: Accounting & Finance, Administration, Operations, Space Planning, Leasing, Data Science, Data Search, Machine Learning, and Artificial Intelligence. Relay India operates from its Ahmedabad and Vadodara offices. The principals and founders of Relay Human Cloud have been focused on delivering high-quality operations in cutting-edge companies for their entire careers.

Job Overview: We are looking for a talented and dedicated Yardi Report Developer with strong experience in YSR reporting to work directly with our US-based clients. The Yardi Report Developer will be responsible for designing, developing, and maintaining custom reports and data visualization solutions within the Yardi property management software. This role is critical to our client's ability to provide accurate and actionable insights to support decision-making and enhance their property management operations.

Key Responsibilities:
- Develop and maintain custom YSR reports within the Yardi Voyager property management software.
- Collaborate with business stakeholders to understand their reporting and data visualization needs.
- Design and create dynamic and interactive reports and dashboards that provide valuable insights.
- Troubleshoot and resolve any issues related to report performance or data accuracy.
- Create and maintain documentation for YSR reports and processes for future reference.
- Stay up to date with Yardi software updates and new features, and implement them as needed.
- Assist in data extraction, transformation, and loading (ETL) processes to support reporting requirements.
- Perform ad-hoc data analysis and reporting tasks as requested by management.
- Provide training and support to end users on YSR reporting capabilities and best practices.

Qualifications:
- Proficient in English, as you will be working directly with US-based clients.
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- 2-3 years of experience with Yardi property management software, with expertise in YSR reporting.
- Strong knowledge of SQL, data modeling, and data warehousing concepts.
- Proficiency in report development tools and technologies, such as Yardi Voyager, YSR, SSRS, Power BI, or similar.
- Excellent problem-solving and analytical skills.
- Detail-oriented with the ability to ensure data accuracy and report quality.
- Self-motivated and able to work independently or as part of a team.

Preferred Qualifications:
- Experience in the real estate or property management industry.
- Knowledge of ETL tools and processes.
- Experience with data visualization best practices.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

indore, madhya pradesh

On-site

As a Solution Architect at DHL Group, a global logistics provider with a workforce of around 600,000 employees spanning over 220 countries and territories, you will play a pivotal role in designing, implementing, and optimizing analytics, data warehousing, and reporting solutions. Your expertise will be essential in ensuring that all solutions meet business requirements, adhere to performance benchmarks, and align with industry standards. Your responsibilities will include leading the design and implementation of analytics and data warehousing solutions, optimizing data pipelines and integrations for accurate and timely data analysis and reporting, conducting data modeling and design to enhance data quality and consistency, collaborating with project teams to define business requirements, and providing technical guidance to development teams, including coding and solution design. Additionally, you will monitor the performance of BI systems and propose improvements to enhance effectiveness while collaborating with cross-functional teams to drive innovation and enhance the organization's data capabilities. To excel in this role, you should have a minimum of 6 years of experience in IT, with at least 4 years in a solution architect role focused on analytics and data warehousing. Proficiency in data modeling, ETL processes, and analytics tools such as Power BI and Snowflake is required. Experience with cloud platforms like AWS and Azure, as well as familiarity with microservices architecture, will be beneficial. Strong analytical and problem-solving skills, excellent verbal and written communication skills, and the ability to explain complex technical concepts to non-technical stakeholders are essential. Experience working in Agile/Scrum environments with a collaborative approach to project delivery is also preferred. 
At DHL Group, we offer you the opportunity to join a leading global company, be part of a dynamic team, enjoy flexible working hours and remote work options, thrive in an international environment, and benefit from an attractive compensation and benefits package. Join us, make a positive impact, and build an amazing career with DHL Group.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

NTT DATA is looking for a talented and passionate individual to join as a Salesforce Data Cloud Specialist in Hyderabad, Telangana, India. As a Salesforce Data Cloud Specialist, you will be responsible for managing and optimizing customer data platforms within Salesforce ecosystems. You will work closely with stakeholders to ensure seamless integration and orchestration of data, aligning data models with business requirements to provide actionable insights. Key Responsibilities: - Implement and configure Salesforce Data Cloud to effectively unify and segment customer data. - Design and manage data models that seamlessly integrate with Salesforce platforms, ensuring high-quality data ingestion and transformation. - Collaborate with stakeholders to understand business requirements and translate them into technical solutions. - Build and manage data pipelines to aggregate and cleanse data from multiple sources. - Develop rules for data normalization, identity resolution, and deduplication. - Maintain data compliance, security, and privacy standards. - Collaborate with marketing, sales, and analytics teams to leverage Data Cloud capabilities for improved customer engagement and personalization. - Troubleshoot and optimize Data Cloud performance to ensure timely issue resolution. Required Skills and Qualifications: - Hands-on experience with Salesforce Data Cloud (formerly known as Customer Data Platform). - Proficiency in data modeling, ETL processes, and data integration within Salesforce ecosystems. - Knowledge of Salesforce CRM, Marketing Cloud, and related modules. - Experience with API integrations and data connectors. - Familiarity with identity resolution and customer segmentation techniques. - Strong understanding of data governance, privacy, and compliance requirements. - Analytical mindset with the ability to derive actionable insights from data. - Excellent communication and collaboration skills. 
Preferred Skills:
- Salesforce certifications such as Salesforce Certified Data Cloud Specialist or related certifications.
- Hands-on experience with SQL, Python, or other data manipulation tools.
- Familiarity with AI/ML models for predictive analytics in customer data.

Educational Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

If you are looking to be part of a dynamic and innovative organization, apply now to join NTT DATA and contribute to our mission of helping clients innovate, optimize, and transform for long-term success.
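The identity resolution and deduplication responsibilities above can be sketched in plain Python. This is an illustrative sketch only, not the Salesforce Data Cloud API; the match rule (normalized email) and the field names are assumptions for the example.

```python
# Illustrative identity resolution: collapse records from different source
# systems into one profile when they resolve to the same normalized email.
# A real Data Cloud ruleset would combine several match keys, not just email.

def normalize_email(email):
    """Lowercase and strip a '+tag' alias so variants match one identity."""
    local, _, domain = email.strip().lower().partition("@")
    local = local.split("+", 1)[0]
    return f"{local}@{domain}"

def resolve_identities(records):
    """Merge records sharing a normalized email into a single profile."""
    profiles = {}
    for rec in records:
        key = normalize_email(rec["email"])
        profile = profiles.setdefault(key, {"email": key, "sources": []})
        profile["sources"].append(rec.get("source", "unknown"))
        if rec.get("name"):  # keep the most recently seen non-empty name
            profile["name"] = rec["name"]
    return list(profiles.values())

crm = {"email": "Jane.Doe+promo@Example.com", "name": "Jane Doe", "source": "crm"}
web = {"email": "jane.doe@example.com", "name": "J. Doe", "source": "web"}
merged = resolve_identities([crm, web])
print(len(merged))  # the two source records collapse into one profile
```

The same shape applies to phone numbers or address keys: normalize first, then group on the normalized value.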

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

vadodara, gujarat

On-site

The role supports the integration, transformation, and delivery of data using tools within the Microsoft Fabric platform. You will collaborate with the data engineering team to provide Data and Insights solutions, ensuring the delivery of high-quality data to enable analytics capabilities within the organization.

Your key responsibilities will include assisting in the development and maintenance of ETL pipelines using Azure Data Factory and other Fabric tools. You will work closely with senior engineers and analysts to gather requirements, develop prototypes, and support data integration from various sources. Additionally, you will help develop and maintain Data Warehouse schemas, contribute to documentation, and participate in testing efforts to uphold data reliability. It is crucial to learn and adhere to data standards and governance practices as directed by the team.

Essential skills and experience for this role include a solid understanding of data engineering concepts and data structures, familiarity with Microsoft data tools such as Azure Data Factory, OneLake, or Synapse, and knowledge of ETL processes and data pipelines. The ability to work collaboratively in an Agile/Kanban team environment is essential. A Microsoft Certified Fabric DP-600 or DP-700 certification, along with any other relevant Azure Data certification, is advantageous.

Desirable skills and experience include familiarity with Medallion Architecture principles, exposure to MS Purview or other data governance tools, an understanding of data warehousing and reporting concepts, and an interest or background in retail data domains.
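The extract-transform-load work described above can be sketched in plain Python. Azure Data Factory would express these steps as pipeline activities rather than functions; the source rows, column names, and cleansing rules here are invented for illustration.

```python
# A minimal ETL sketch: extract raw rows, cleanse them, load into a target
# keyed by id. Real pipelines read from connectors and write to a warehouse;
# the in-memory list and dict stand in for those endpoints.

def extract(rows):
    """Pull raw rows from a source system (here, an in-memory list)."""
    return list(rows)

def transform(rows):
    """Cleanse: drop rows missing a key, trim text, coerce amount to float."""
    clean = []
    for row in rows:
        if not row.get("id"):
            continue  # reject records without a key
        clean.append({
            "id": row["id"],
            "customer": row.get("customer", "").strip().title(),
            "amount": float(row.get("amount", 0)),
        })
    return clean

def load(rows, warehouse):
    """Upsert cleansed rows into the target table keyed by id."""
    for row in rows:
        warehouse[row["id"]] = row
    return warehouse

source = [
    {"id": "A1", "customer": "  acme corp ", "amount": "19.50"},
    {"id": None, "customer": "bad row", "amount": "0"},
]
warehouse = load(transform(extract(source)), {})
print(warehouse["A1"]["customer"])  # "Acme Corp"
```

In a Medallion layout, the raw input would land in a bronze layer and the cleansed output in silver; the separation of the three steps is what makes each layer independently testable.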

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

hyderabad, telangana

On-site

You will be joining as an Oracle PL/SQL Developer with 2 to 4 years of experience, based in Hyderabad. Your primary responsibilities will include working with Oracle 11g/12c and leveraging advanced PL/SQL skills for packages, procedures, functions, triggers, and batch coding. You should also have a strong grasp of performance tuning techniques using SQL Trace, Explain Plan, indexing, and hints.

To excel in this role, you must possess a Bachelor's degree in Computer Science or a related field, along with at least 2 years of hands-on experience in Oracle PL/SQL development. A solid understanding of relational databases, proficiency in writing and optimizing PL/SQL code, and familiarity with database design and data modeling are essential. You should also be well versed in database performance tuning and optimization strategies, database backup and recovery processes, and version control systems such as Git or SVN.

Knowledge of data integration and ETL processes, along with experience in Agile/Scrum environments, will be advantageous. Strong analytical and problem-solving skills are crucial, as is the ability to collaborate effectively in a team setting. Excellent verbal and written communication skills are highly valued. Certifications in Oracle technologies would be a plus, and familiarity with programming languages such as Java or Python is considered beneficial.

If you meet these qualifications and have the required skills in database performance tuning, SQL querying, ETL processes, database design, and PL/SQL development, we encourage you to apply for this exciting opportunity.
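The Explain Plan and indexing workflow mentioned above can be illustrated with Python's built-in sqlite3. Oracle's tooling (DBMS_XPLAN, SQL Trace) works differently, but the loop is the same: inspect the plan, add an index, confirm the plan changed. The table and query here are invented.

```python
# Inspect a query plan before and after adding an index. sqlite's
# EXPLAIN QUERY PLAN plays the role Oracle's Explain Plan plays: it shows
# whether the engine scans the whole table or searches an index.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO orders (status) VALUES (?)",
                 [("OPEN",) if i % 10 == 0 else ("CLOSED",) for i in range(1000)])

def plan(sql):
    """Return the plan detail text sqlite chooses for a statement."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT count(*) FROM orders WHERE status = 'OPEN'"
before = plan(query)  # full table scan, since status is unindexed
conn.execute("CREATE INDEX idx_orders_status ON orders (status)")
after = plan(query)   # the planner now picks the index
print(before)
print(after)
```

The same habit transfers directly to Oracle: re-run the plan after every index or hint change rather than assuming the optimizer picked it up.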

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

faridabad, haryana

On-site

We are seeking a skilled QA / Data Engineer with 3-5 years of experience. As the ideal candidate, you will possess expertise in manual testing and SQL, along with knowledge of automation and performance testing. Your primary responsibility will be to ensure the quality and reliability of our data-driven applications through comprehensive testing and validation.

Key Responsibilities:
- Utilize extensive experience in manual testing, particularly in data-centric environments.
- Demonstrate strong SQL skills for data validation, querying, and testing database functionalities.
- Apply data engineering concepts, including ETL processes, data pipelines, and data warehousing.
- Work with geo-spatial data to enhance data quality and analysis.
- Apply QA methodologies and best practices for software and data testing.
- Communicate effectively for seamless collaboration within the team.

Desired Skills:
- Experience with automation testing tools and frameworks (e.g., Selenium, JUnit) for data pipelines.
- Proficiency in performance testing tools (e.g., JMeter, LoadRunner) to evaluate data systems.
- Familiarity with data engineering tools and platforms (e.g., Apache Kafka, Apache Spark, Hadoop).
- Understanding of cloud-based data solutions (e.g., AWS, Azure, Google Cloud) and their testing methodologies.

Qualifications:
- Bachelor of Engineering / Bachelor of Technology (B.E./B.Tech.)

In this role, you will play a crucial part in ensuring the quality of our data-centric applications by conducting thorough testing and validation. Your expertise in manual testing, SQL, ETL processes, data pipelines, and data warehousing, together with additional skills in automation and performance testing, will be key to your success. Join our team in Bengaluru/Gurugram and contribute to the reliability and efficiency of our data-driven solutions.
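The SQL-based data validation listed above can be sketched with Python's built-in sqlite3: reconcile row counts between source and target, then check a required column for nulls. The table and column names are invented for the example.

```python
# Two common validation checks a QA / Data Engineer runs after a load:
# (1) reconciliation - did every source row land in the target?
# (2) completeness - does a required column contain nulls?
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, city TEXT)")
conn.execute("CREATE TABLE tgt (id INTEGER, city TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?)",
                 [(1, "Pune"), (2, "Delhi"), (3, None)])
conn.executemany("INSERT INTO tgt VALUES (?, ?)",
                 [(1, "Pune"), (2, "Delhi"), (3, None)])

def scalar(sql):
    """Run a query that returns a single value and unwrap it."""
    return conn.execute(sql).fetchone()[0]

counts_match = scalar("SELECT count(*) FROM src") == scalar("SELECT count(*) FROM tgt")
null_cities = scalar("SELECT count(*) FROM tgt WHERE city IS NULL")

print(counts_match, null_cities)  # counts reconcile, but a null got through
```

The point of running both checks is that they fail independently: a load can reconcile on row counts while still carrying incomplete records, so count checks alone are not enough.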

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As an SME_IMAL with 3-5 years of experience, you will be responsible for demonstrating a strong understanding of iMAL functionality and its database schema. Your expertise in iMAL operations, relational database management systems (RDBMS), SQL, ETL processes, data mapping, and data transformation scripting will be crucial for ensuring data accuracy and consistency. Familiarity with Islamic banking principles and financial operations is also required for this role.

Your role will involve collaborating with stakeholders, facilitating seamless transformations, and ensuring the accuracy and consistency of data. Your contact for further information is 9902084678. Thank you for your interest in this position.
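The data mapping and transformation scripting mentioned above can be sketched generically in Python. The legacy field names and target layout here are invented, not the actual iMAL schema; in practice they would come from the functional mapping documents.

```python
# Illustrative legacy-to-target field mapping: rename columns per a mapping
# table and surface any legacy fields the mapping does not yet cover, so
# they can be raised with stakeholders instead of silently dropped.

FIELD_MAP = {
    "CUST_NO": "customer_id",
    "CUST_NM": "customer_name",
    "BR_CODE": "branch_code",
}

def map_record(legacy):
    """Rename legacy columns per FIELD_MAP and flag anything unmapped."""
    target, unmapped = {}, []
    for field, value in legacy.items():
        if field in FIELD_MAP:
            target[FIELD_MAP[field]] = value
        else:
            unmapped.append(field)  # queue for the next mapping workshop
    return target, unmapped

row = {"CUST_NO": "0001", "CUST_NM": "Al Noor Trading", "LEGACY_FLAG": "Y"}
target, unmapped = map_record(row)
print(target)    # mapped columns only
print(unmapped)  # the unmapped legacy field, flagged rather than lost
```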

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies