
1052 ETL Processes Jobs - Page 17

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

10.0 - 12.0 years

3 - 15 Lacs

Bengaluru, Karnataka, India

On-site

Looking for an experienced Project Manager or Delivery Manager to lead a critical migration project, transitioning approximately 100 applications from PowerCenter on-premises to SaaS.

Engagement Details:
- Start date: ASAP
- Location: Bangalore, in Swiss Re office

Key Responsibilities:
- Drive the initial migration phase to Cloud Data Integration for PowerCenter (CDI-PC)
- Lead the creation of project artifacts according to Swiss Re standard TS4PM
- Maintain momentum through structured progress tracking and regular stakeholder updates
- Collaborate closely with internal teams and manage vendor contributions

Candidate Profile:
- 10+ years of project/delivery management experience
- Proven track record in PowerCenter migration projects, ideally having led a similar initiative
- Strong organizational and communication skills
- Results-driven and able to work with minimal oversight

Posted 1 month ago

Apply

12.0 - 14.0 years

4 - 8 Lacs

Bhubaneswar, Odisha, India

On-site

Summary: As a Technology OpS Support Practitioner, you will be responsible for maintaining the integrity and governance of systems while following best practices for service delivery. Your role involves developing, deploying, and supporting infrastructures, applications, and technology initiatives in line with organizational standards and delivery methods. Collaboration with various teams will be key to ensuring operational excellence and driving successful technology initiatives.

Roles & Responsibilities:
- Serve as a Subject Matter Expert (SME) with deep expertise in Informatica PowerCenter.
- Influence and advise teams on technical and operational decisions.
- Take ownership of team decisions and outcomes.
- Collaborate with multiple teams and contribute to key technical and strategic decisions.
- Provide effective solutions to complex problems impacting multiple teams.
- Conduct training sessions and workshops to build and enhance team capabilities.
- Monitor implemented solutions, evaluate their effectiveness, and adjust as necessary.

Professional & Technical Skills:
- Proficiency in Informatica PowerCenter.
- Strong understanding of data integration and ETL (Extract, Transform, Load) processes.
- Experience with data warehousing concepts and methodologies.
- Familiarity with database management systems and SQL.
- Skilled in troubleshooting and resolving technical issues efficiently.

Additional Information:
- Minimum 15 years of experience in Informatica PowerCenter.
- Position based in Mumbai office.
- Educational qualification: 15 years of full-time education required.

Posted 1 month ago

Apply

5.0 - 10.0 years

3 - 15 Lacs

Hyderabad, Telangana, India

On-site

Skills required:
- Overall experience of 5 years in DW/BI technologies and a minimum of 5 years of development experience with the ETL DataStage 8.x/9.x tool.
- Worked extensively on parallel jobs and sequences, and preferably on routines.
- Good conceptual knowledge of data warehousing and various methodologies.
- Strong SQL skills in Teradata and other databases such as Oracle, SQL Server, and DB2.
- Working knowledge of UNIX shell scripting.
- Good communication and presentation skills.
- Should be flexible with overlapping working hours, as close collaboration with US/UK clients is required.
- Should be able to work independently and act proactively.

Posted 1 month ago

Apply

5.0 - 9.0 years

3 - 7 Lacs

Chennai, Tamil Nadu, India

On-site

Job description

Group/Division
The Information Technology (IT) group at KLA is involved in every aspect of the global business. IT's mission is to enable business growth and productivity by connecting people, process, and technology. It focuses not only on enhancing the technology that enables our business to thrive but also on how employees use and are empowered by technology. This integrated approach to customer service, creativity, and technological excellence enables employee productivity, business analytics, and process excellence.

Job Description
- Experience in data warehousing: dimensions, facts, and data modeling.
- Experience gathering and analyzing system requirements.
- Demonstrated expertise in Snowflake data modeling and ELT using Snowflake SQL, implementing complex stored procedures and best practices with data warehouse and ETL concepts.
- Assess and understand ETL jobs, workflows, BI tools, and reports.
- Responsible for end-to-end data analytics design.
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, stored procedures, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, and data sharing.
- In-depth understanding of data warehouse and ETL concepts and modeling structure principles.
- Ability to function effectively in a cross-team environment.
- Strong analytical, quantitative, problem-solving, and organizational skills.
- Good working knowledge of any ETL tool (Informatica, ODI, etc.).
- Good to have: familiarity with data visualization tools (Tableau/Power BI).
- Good to have: exposure to the AWS/Azure data ecosystem.
- Good to have: exposure to HANA and SAP CRM.

Minimum Qualifications
- Degree in Data Analytics, Information Systems, or a related field.
- 5+ years of experience in data analysis, business analysis, or a related strategy role.

Posted 1 month ago

Apply

5.0 - 7.0 years

6 - 11 Lacs

Bengaluru, Karnataka, India

On-site

Mandatory Skills:
- Diffusion model optimization at scale
- Multi-modal alignment
- Fine-tuning LLMs with Reinforcement Learning from Human Feedback (RLHF)
- Sparse attention mechanism engineering
- Training foundational models from scratch
- Latent space manipulation and controllability in generative models

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5-7 years of relevant experience in software development.
- Demonstrated experience working with generative AI technologies.
- Excellent communication and interpersonal skills.
- Ability to adapt to changing priorities and work under tight deadlines.

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bhubaneswar, Odisha, India

On-site

We are seeking a skilled Data Engineer to design, develop, and maintain robust data solutions that enable efficient data generation, collection, and processing. The role involves creating scalable data pipelines, ensuring data quality, and implementing ETL processes to support seamless data migration and deployment across systems. You will collaborate with cross-functional teams to understand business data needs and troubleshoot issues to maintain smooth data flow and processing.

Roles and Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes
- Ensure data quality and reliability across various data sources and systems
- Collaborate with cross-functional teams to understand data requirements and deliver solutions
- Analyze and model client, market, and key performance data to generate business insights
- Work on data integration and migration projects, ensuring efficient data flow
- Develop and optimize BigQuery SQL scripts for performance tuning
- Utilize cloud-native services such as Google Cloud Storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes
- Automate CI/CD pipelines using tools like Azure DevOps, GitHub, JIRA, and Confluence
- Write clean, efficient Python code using libraries such as Pandas and NumPy
- Troubleshoot and resolve data pipeline issues independently or as part of a team
- Work effectively in an Agile development environment
- Mentor and guide junior team members when needed

Professional and Technical Skills:
- Expertise in SQL and Python programming (no flex)
- Strong hands-on experience with Google BigQuery and BigQuery SQL performance tuning
- Proficiency with cloud-native data platform services on GCP (Google Cloud Platform)
- Experience with shell scripting, Oracle, and various data structures (dictionary, array, list, tree)
- Knowledge of automated CI/CD pipelines and tools (Control-M, GitHub, Azure DevOps)
- Familiarity with Agile methodologies and collaborative teamwork
- GCP certification preferred

Professional Attributes:
- Good communication skills
- Strong analytical and problem-solving abilities
- Ability to work independently and in a team environment
- Capability to manage and handle teams
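The role above centers on pipeline transform steps over everyday data structures. As a rough, stdlib-only illustration (the field names and sample data are invented for this sketch, not taken from the posting), a minimal aggregation transform using a dictionary might look like:

```python
import csv
import io
from collections import defaultdict

# Hypothetical raw export: one row per transaction. Field names
# ("region", "amount") are illustrative only.
RAW = """region,amount
south,100
north,250
south,50
"""

def load_totals(raw_csv):
    """Transform step: aggregate transaction amounts per region."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(raw_csv)):
        totals[row["region"]] += float(row["amount"])
    return dict(totals)

print(load_totals(RAW))  # → {'south': 150.0, 'north': 250.0}
```

A production pipeline would typically do the same grouping with Pandas or BigQuery SQL, but the dictionary-based accumulation pattern is the same.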

Posted 1 month ago

Apply

5.0 - 9.0 years

6 - 11 Lacs

Bengaluru, Karnataka, India

On-site

Key Responsibilities:
- Data Migration: Lead the data migration efforts, including extraction, transformation, and loading (ETL) processes. Execute data migration activities between different systems using tools like KingswaySoft, Scribe, and SSRS packages.
- ETL Processes: Oversee the entire ETL pipeline to ensure smooth data transfers, data integrity, and data cleansing. Implement best practices for data migration and troubleshoot any issues during the process.
- SQL Expertise: Use PL/SQL and advanced SQL to write complex queries for data extraction, transformation, and loading. Perform database operations and manage large datasets efficiently.
- CRM Knowledge: Leverage your knowledge of Dynamics 365 and Salesforce CRM to ensure data integrity and consistency during migration and integration.
- Documentation & Reporting: Maintain comprehensive documentation of the migration processes, including all SSRS reports and data transformations. Create SSRS packages for reporting and data analysis.

Required Qualifications:
- Experience: 7+ years of hands-on experience in data migration and ETL processes.
- Tools & Technologies: Expertise in KingswaySoft, Scribe, and writing SSRS packages.
- SQL: Strong experience in PL/SQL and advanced SQL.
- Database Operations: Proficient in managing DB operations and large-scale database environments.
- CRM Knowledge: Experience with Dynamics 365 and Salesforce CRM.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be responsible for developing and enhancing objects in Oracle Data Integrator (ODI), RPD, and OACS reports and analyses. Your role will involve providing end-to-end solution architecture for integrating Oracle ERP and SCM Cloud applications using ODI and OACS. You will be tasked with ODI ELT data load scheduling, monitoring, and troubleshooting, as well as applying data fixes. Furthermore, you will need to coordinate with the onshore team by setting day-to-day execution targets. Your expertise will be crucial in integration architecture design, solution selection, implementation, and continuous technical improvement. Collaboration with a cross-functional team to resolve issues and guide team members to successful delivery will also be part of your responsibilities.

To qualify for this role, you must have:
- At least 5 years of demonstrable hands-on development experience using standard SDLC and/or Agile methodologies in an ODI/OACS/PL/SQL developer role.
- 5+ years of experience extracting data from Oracle ERP/SCM.
- Proficiency in developing ETL processes (ETL control tables, error logging, auditing, data quality, etc.) and implementing reusability, parameterization, and workflow design.
- Expertise in the Oracle ODI 12c toolset, Oracle PL/SQL, RPD, OACS, BICC, and BI Publisher, as well as knowledge of data modeling and ETL design.
- The ability to integrate ODI with multiple sources and targets, and experience in error recycling/management using ODI.
- Strong knowledge of database object development (SQL/PL/SQL) and ELT/ETL concepts, design, and coding.
- Expert knowledge of OBIEE/OAC RPD design and BI analytics report design; familiarity with BI Apps is an advantage.
- Experience creating PL/SQL packages, procedures, functions, triggers, views, materialized views, and exception handling for retrieving, manipulating, checking, and migrating complex datasets in Oracle.

Your role will also involve devising partitioning and indexing strategies for optimal performance and leading support and development projects. Good verbal and written communication skills in English, along with strong interpersonal, analytical, and problem-solving abilities, are essential. Any certification in ODI or OACS will be a plus, as will good knowledge of Oracle Database, development experience in database applications, and traits such as creativity, personal drive, influencing and negotiating skills, and problem-solving capabilities.
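The posting mentions the ETL control-table pattern (each load run is registered before it starts and stamped with its outcome so failed runs can be found and recycled). A minimal sketch of that idea, using Python's sqlite3 rather than ODI's own repository tables, with invented table and column names:

```python
import sqlite3

# Invented control-table schema for illustration; real ODI repositories
# use their own metadata tables.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE etl_control (
        run_id      INTEGER PRIMARY KEY AUTOINCREMENT,
        job_name    TEXT NOT NULL,
        status      TEXT NOT NULL DEFAULT 'RUNNING',
        rows_loaded INTEGER,
        error_msg   TEXT
    )
""")

def start_run(job_name):
    """Register a load run before it begins; returns its run_id."""
    cur = conn.execute(
        "INSERT INTO etl_control (job_name) VALUES (?)", (job_name,))
    conn.commit()
    return cur.lastrowid

def finish_run(run_id, rows_loaded):
    """Stamp a run as successful with its row count."""
    conn.execute(
        "UPDATE etl_control SET status='SUCCESS', rows_loaded=? WHERE run_id=?",
        (rows_loaded, run_id))
    conn.commit()

def fail_run(run_id, error_msg):
    """Stamp a run as failed so it can be recycled later."""
    conn.execute(
        "UPDATE etl_control SET status='FAILED', error_msg=? WHERE run_id=?",
        (error_msg, run_id))
    conn.commit()

run = start_run("load_gl_balances")  # hypothetical job name
finish_run(run, 1200)
print(conn.execute(
    "SELECT job_name, status, rows_loaded FROM etl_control").fetchall())
```

The same audit trail supports the error-logging and auditing requirements listed above: querying for `status='FAILED'` yields exactly the runs to recycle.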

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Nagpur, Maharashtra

On-site

As a Senior Data Analyst at Gibbous Technologies, a data analytics consulting firm specializing in sports business analytics and advanced business intelligence solutions, you will be responsible for leading analytics projects, mentoring junior analysts, and collaborating closely with clients to deliver tailored solutions. Your role will involve designing, executing, and delivering analytics projects from data collection to visualization, conducting deep-dive analyses to identify trends and actionable insights, and developing dashboards, reports, and data models to support decision-making. You will work with stakeholders to translate business questions into analytical problems and solutions, ensure data quality, integrity, and compliance with best practices, and collaborate with cross-functional teams to integrate analytics into business workflows. Additionally, you will mentor and guide junior analysts in technical skills and analytical approaches, leveraging your strong analytical expertise, passion for data storytelling, and experience with diverse datasets to deliver impactful insights for clients.

The ideal candidate will have:
- A Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Information Systems, or a related field.
- 4+ years of professional experience in data analytics, preferably in consulting or business intelligence roles.
- Proficiency in SQL and at least one analytics-oriented programming language (Python/R).
- Experience with data visualization tools (e.g., Power BI, Tableau, Looker).
- A solid understanding of statistical analysis, hypothesis testing, and predictive modeling.
- Excellent communication and presentation skills, the ability to simplify complex findings, and proven leadership in analytics initiatives.

Preferred qualifications include experience in sports analytics or business analytics for enterprises, knowledge of cloud-based data platforms (Snowflake, BigQuery, AWS Redshift, etc.), and familiarity with ETL processes and data pipeline development. In return, Gibbous Technologies offers a competitive salary based on experience and expertise, the opportunity to work on diverse and impactful projects, a collaborative and innovation-driven work culture, and professional growth and learning opportunities.

To apply for the Senior Data Analyst position in Nagpur at Gibbous Technologies, please send your resume and a brief cover letter to shreyas@gibbous.io with the subject line "Application: Senior Data Analyst (Nagpur)". Join our team and be part of a dynamic environment where you can make a significant impact through data analytics and business intelligence.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Vadodara, Gujarat

On-site

As a Lead Data Engineer at Rearc, you will play a crucial role in establishing and maintaining technical excellence within our data engineering team. Your extensive experience in data architecture, ETL processes, and data modeling will be key to optimizing data workflows for efficiency, scalability, and reliability. Collaborating closely with cross-functional teams, you will design and implement robust data solutions that align with business objectives and adhere to best practices in data management. Building strong partnerships with technical teams and stakeholders is essential as you drive data-driven initiatives and ensure their successful implementation. With over 10 years of experience in data engineering or related fields, you bring a wealth of expertise in managing and optimizing data pipelines and architectures. Your proficiency in Java and/or Python, along with experience in data pipeline orchestration using platforms like Airflow, Databricks, DBT, or AWS Glue, will be invaluable. Hands-on experience with data analysis tools and libraries such as PySpark, NumPy, Pandas, or Dask is required, while proficiency with Spark and Databricks is highly desirable. Your proven track record of leading complex data engineering projects, coupled with hands-on experience in ETL processes, data warehousing, and data modeling tools, enables you to deliver efficient and robust data pipelines. You possess in-depth knowledge of data integration tools and best practices, as well as a strong understanding of cloud-based data services and technologies like AWS Redshift, Azure Synapse Analytics, and Google BigQuery. Your strategic and analytical skills will enable you to solve intricate data challenges and drive data-driven decision-making.

In this role, you will collaborate with stakeholders to understand data requirements and challenges, implement data solutions with a DataOps mindset using modern tools and frameworks, lead data engineering projects, mentor junior team members, and promote knowledge sharing through technical blogs and articles. Your exceptional communication and interpersonal skills will facilitate collaboration with cross-functional teams and effective stakeholder engagement at all levels. At Rearc, we empower engineers to build innovative products and experiences by providing them with the best tools possible. If you are a cloud professional with a passion for problem-solving and a desire to make a difference, join us in our mission to solve problems and drive innovation in the field of data engineering.

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Kochi, Kerala

On-site

As a Data Analyst, you will be responsible for analyzing large volumes of analytical data (sales, customer, product) to identify patterns, trends, and insights. Your key responsibilities will include developing and maintaining data reports and dashboards using business intelligence tools such as Tableau and Power BI. You will collaborate with business teams to understand key business objectives and data needs, and analyze customer behavior, product usage, and historical purchase data to identify cross-selling opportunities. Ensuring that data models and structures are well designed to facilitate analysis and reporting will be crucial. Additionally, you will act as a bridge between data engineers and business stakeholders, ensuring data is aligned with business needs.

To excel in this role, you should:
- Be proficient in data engineering tools and languages such as Python, SQL, and Java.
- Have strong knowledge of business intelligence tools such as Tableau, Power BI, or similar for creating reports and dashboards.
- Be able to work with databases (SQL and NoSQL) and perform complex data queries.
- Be familiar with data wrangling, ETL processes, and data-cleaning techniques.
- Have a clear understanding of data warehousing and cloud-based analytics.
- Be able to apply data insights to influence marketing strategies, sales tactics, and product offerings.

Ideally, you should have 2-4 years of experience in a data analyst role with a track record of analyzing customer, sales, and product data. If you are looking for a challenging opportunity where you can leverage your analytical skills and contribute to impactful business decisions, we encourage you to apply now.
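Data wrangling and cleaning, mentioned among the skills above, typically boils down to normalizing text fields, coercing types, and dropping records that fail validation. A minimal sketch (the records and cleaning rules are invented for illustration):

```python
# Hypothetical raw customer records as exported from a source system.
raw_rows = [
    {"customer": " Alice ", "spend": "120.50"},
    {"customer": "BOB", "spend": "n/a"},   # unparseable spend -> dropped
    {"customer": "carol", "spend": "80"},
]

def clean(rows):
    """Normalize names, coerce spend to float, skip invalid records."""
    cleaned = []
    for row in rows:
        try:
            spend = float(row["spend"])
        except ValueError:
            continue  # validation failure: drop the record
        cleaned.append({"customer": row["customer"].strip().title(),
                        "spend": spend})
    return cleaned

print(clean(raw_rows))
# → [{'customer': 'Alice', 'spend': 120.5}, {'customer': 'Carol', 'spend': 80.0}]
```

In practice the same rules are usually expressed with Pandas or SQL, but the validate-then-normalize structure carries over directly.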

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a Senior Solution Architect specializing in Database Technology, you will play a crucial role in architecting, designing, and implementing complex database solutions for our enterprise clients. Your expertise in both relational and non-relational database systems will be instrumental in creating scalable, secure, and high-performance solutions that align with our clients' business requirements.

Your primary responsibilities will include leading the design and implementation of database solutions, considering factors such as scalability, performance, availability, and security. You will provide expert guidance on the selection and integration of database technologies based on client needs, develop detailed system architecture blueprints, and ensure the delivery of high-performance, reliable database architectures tailored to specific business use cases. Collaboration with business stakeholders, technical teams, and project managers will be essential to translate business requirements into technical solutions effectively. You will also consult with clients on best practices for database management, optimization, and scaling, and support pre-sales activities through technical presentations and scoping of project deliverables. Staying abreast of emerging database technologies and trends will be part of your role, allowing you to recommend innovative solutions and mentor junior architects and developers. Your leadership in troubleshooting, performance optimization, and adherence to defined service-level agreements will be critical to the success of database systems across projects. In addition, you will oversee the implementation of database solutions, manage relationships with third-party database vendors, contribute to the project roadmap, and ensure compliance with industry standards and regulations such as GDPR and HIPAA.

Your technical expertise should encompass:
- Relational and NoSQL databases, and cloud-based database platforms.
- Database design, optimization, and scaling for various workloads.
- Familiarity with data warehousing, ETL processes, and data integration technologies.
- Proven experience architecting large-scale, high-performance database systems.
- Strong knowledge of data modeling, schema design, and database performance tuning.
- Proficiency in database automation, monitoring, and management tools.
- Experience with containerization and orchestration tools, and familiarity with DevOps practices for database deployments.

Strong leadership, mentoring, communication, and presentation skills will be essential for guiding cross-functional teams and communicating technical concepts effectively to diverse stakeholders. Certifications such as Oracle Certified Architect, AWS Certified Solutions Architect, or Microsoft Certified: Azure Solutions Architect Expert will be advantageous in demonstrating your expertise in database technology.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

You will be working as a Data Engineer with 5-8 years of hands-on experience, based in Coimbatore, Hyderabad, or remotely. Your responsibilities will include using the Databricks environment with PySpark to design, develop, and maintain scalable data pipelines. Strong data analysis skills and experience implementing ETL processes to ensure data quality and performance are essential. Knowledge of data warehousing concepts, data modeling, and metadata management will be advantageous. Good communication skills, especially customer-interfacing skills, are also required for this role.

Posted 1 month ago

Apply

2.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We are looking for a highly motivated and skilled Salesforce Data Cloud Architect to design, develop, and optimize the Data Cloud data model and use cases. The successful candidate will work closely with cross-functional teams, including Marketing, IT, and Data Analytics, to bring dynamic content and tailored experiences to our customers.

Job Description:

Key Responsibilities:
- Lead the end-to-end implementation of Salesforce Data Cloud (CDP), including data acquisition, integration, quality assurance, and utilization.
- Configure and implement data-driven segmentation strategies, ensuring accurate audience targeting and content delivery.
- Design, document, and implement data models, data pipelines, and transformations to support data ingestion, integration, and enrichment within Salesforce Data Cloud.
- Be curious and up to speed with the fast-paced releases coming to the platform.
- Collaborate with IT teams to ensure seamless data integration, troubleshoot technical issues, and optimize system performance for data initiatives.
- Integrate data from various sources, including CRM systems, databases, and third-party platforms, to support marketing and personalization efforts.
- Provide training and support to development teams on utilizing Salesforce Data Cloud features and capabilities.

SFDC Skills:
- Configure and optimize the Data Cloud platform to meet business needs.
- Integrate Data Cloud with:
  - Salesforce CRM
  - Salesforce MC
  - Salesforce Marketing Intelligence
  - Websites/microsites, using the SDK method (set up connectors, sitemaps, schema, data streams)
  - Snowflake
  - Other sources
- Set up and manage data streams from various sources, ensuring seamless data flow.
- Configure and develop criteria for identity management, data transformations, and calculated insights.
- Configure and develop lead scoring based on customer data from CRM and engagement data from different touchpoints (such as website and MC engagement) using data transformations and calculated insights.
- Configure data transformations for data lake objects.
- Develop and maintain data models that enhance data quality and usability.
- Assist in the creation of customer segments and support marketing and analytics teams.
- Monitor the platform to detect and resolve disruptions or errors promptly.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Marketing, or a related field.
- Proven experience (2+ years) with a strong focus on Salesforce Data Cloud and/or custom database solutions.
- Salesforce Data Cloud Consultant certification strongly preferred.
- Strong understanding of marketing automation, data segmentation, personalized customer journeys, decisioning, and Next Best Action.
- Experience with data integration and API utilization.
- Expertise in data modeling, ETL processes, data integration tools, and SQL.
- Experience with customer data platforms (CDPs) and data management practices, including data governance and compliance.
- Excellent problem-solving skills and attention to detail, with the ability to troubleshoot operational challenges.
- Familiarity with cloud technologies (e.g., AWS, Azure, GCP) and data modeling/scoring technologies.
- Strong communication and collaboration skills, with the ability to work effectively across cross-functional teams.

Location: DGS India - Pune - Baner M-Agile
Brand: Merkle
Time Type: Full time
Contract Type: Permanent

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Lead BI Engineer, you will be responsible for day-to-day tasks involving Extract, Transform, Load (ETL) processes, data integration, data modeling, and analysis, as well as mentoring junior developers. You will help bring rigor and discipline to day-to-day operations and production support. The ability to work in a fast-paced, high-energy environment with a sense of urgency and attention to detail is essential. Coordinating closely with other BI team members to ensure meaningful prioritization is a key aspect of this role. You will be expected to escalate potential issues in a timely fashion and seek paths to resolution. Excellent communication skills and the ability to manage expectations are crucial for success in this position.

Responsibilities will include designing, developing, and maintaining ETL processes using Informatica IICS (Integration Cloud Services) and IDMC (Intelligent Data Management Cloud), and ensuring efficient data extraction, transformation, and loading from various source systems. You will work with modern data warehousing platforms, including Snowflake, to build schemas, slowly changing dimensions (SCDs), hierarchy flattening, and profiling, and will write complex SQL queries to extract, transform, and load data efficiently. Collaboration with data engineers and data scientists, leveraging platforms like Databricks for data exploration, transformation, and machine learning, is essential. You will create advanced Power BI dashboards and reports to visualize data insights and explore opportunities to integrate AI/ML models into BI solutions for predictive analytics. Leading and mentoring ETL developers, ensuring efficient scheduling and load balancing, and coordinating with business stakeholders and cross-functional teams for design, testing, and validation are also part of the role.

Qualifications

Required Skills and Experience:
- Bachelor's degree in Computer Science, Engineering, Business, or a related field.
- Minimum 10+ years of experience in BI development and data analytics.
- Proven track record of successful project delivery.
- Strong experience with cloud-based data solutions.
- Proficiency in SQL and ETL processes.
- Proficiency in IICS/IDMC.
- Familiarity with Snowflake and/or Databricks, and Power BI.
- Understanding of AI/ML concepts.
- Strong customer-facing, conflict resolution, and negotiation skills.
- Strong analytical skills.

Preferred Skills:
- Master's degree in Computer Science, Engineering, Business, or a related field.

Posted 1 month ago

Apply

0.0 - 3.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are looking for a highly skilled Technical Data Analyst to join our growing team. You will need a strong technical foundation in Oracle PL/SQL and Python, along with expertise in data analysis tools and techniques. The ideal candidate is a strategic thinker with the ability to lead and mentor a team of data analysts, driving data-driven insights and contributing to key business decisions. You will also be responsible for researching and evaluating emerging AI/ML tools and techniques for potential application in data analysis projects.

Your responsibilities will include designing, developing, and maintaining complex Oracle PL/SQL queries and procedures for data extraction, transformation, and loading (ETL) processes, and using Python scripting for data analysis, automation, and reporting. Performing in-depth data analysis to identify trends, patterns, and anomalies and to provide actionable insights for improving business performance will also be part of your role. Collaboration with cross-functional teams to understand business requirements and translate them into technical specifications is crucial. You will develop and maintain data quality standards and ensure data integrity across various systems. Additionally, you will leverage data analysis and visualization tools such as Tableau, Power BI, and Qlik Sense to create interactive dashboards and reports for business stakeholders, and stay up to date with the latest data analysis tools, techniques, and industry best practices, including AI/ML advancements.

Preferred qualifications:
- Hands-on work experience as a Technical Data Analyst with expertise in Oracle PL/SQL and Python programming.
- Proficiency in Python scripting for data analysis and automation.
- Expertise in data visualization tools such as Tableau, Power BI, or Qlik Sense.
- Awareness and understanding of AI/ML tools and techniques in data analytics, and practical experience applying them in data analysis projects.
- Strong analytical, problem-solving, communication, and interpersonal skills, along with experience in the financial services industry.

Qualifications:
- 0-2 years of relevant experience.
- Experience in programming/debugging used in business applications.
- Working knowledge of industry practice and standards.
- Comprehensive knowledge of a specific business area for application development.
- Working knowledge of program languages.
- Consistently demonstrates clear and concise written and verbal communication.

Education: Bachelor's degree/university degree or equivalent experience.

Please note that this job description provides a high-level review of the types of work performed; other job-related duties may be assigned as required.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

You will be joining a dynamic team dedicated to delivering custom and standard reports to clients using cutting-edge data visualization and reporting tools, primarily Power BI and SQL. As a talented and motivated Business Intelligence Analyst / Data Analyst, your key responsibilities will include designing efficient data models and relationships, collaborating with data engineering teams, optimizing DAX calculations and queries, resolving issues in data modeling and reporting, using AI tools to enhance productivity, automating reporting processes, and providing insights and recommendations based on data analysis to support business objectives. You should have 4-8 years of business intelligence or data analytics experience, with hands-on Databricks experience required. You should also have proven expertise in Power BI data modeling (Import, DirectQuery, and Composite models); strong skills in DAX, Power Query (M language), and SQL; experience with big data reporting and handling large-scale datasets efficiently; the ability to work independently with minimal supervision; strong problem-solving skills; experience with AI tools for efficiency improvement; knowledge of data warehousing concepts and ETL processes; and excellent communication and collaboration skills.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

We are seeking an experienced SAS professional to lead and execute the migration of existing SAS Enterprise Guide (EG) programs and processes to the modern SAS Viya platform. You should have strong expertise in both SAS EG and SAS Viya environments and will be responsible for ensuring a smooth transition while optimizing code and processes. Your key responsibilities will include assessing existing SAS EG programs, developing and implementing migration strategies and frameworks, converting SAS EG programs to SAS Viya-compatible code, optimizing existing code for better performance in the Viya environment, creating and maintaining documentation for migration processes and procedures, providing training and support to team members during the transition, collaborating with stakeholders to ensure business requirements are met, performing testing and validation of migrated programs, and troubleshooting migration-related issues. For migration planning, you will analyze the current SAS EG environment and applications, create a detailed migration roadmap, identify potential risks and mitigation strategies, and establish a timeline with milestones. For technical implementation, you will convert SAS EG programs to a Viya-compatible format, optimize code for CAS processing, implement new features available in Viya, and ensure data security and access controls. You will also be responsible for quality assurance: developing testing strategies, performing parallel runs, validating results, and documenting any discrepancies. Knowledge transfer will involve creating training materials, conducting workshops, providing ongoing support, and documenting best practices. Monitoring and maintenance will require tracking the migration process, monitoring performance, addressing issues and concerns, and providing regular status updates. This is a full-time position that may require occasional overtime during critical migration phases.
It is a hybrid work environment with a mix of remote and office work, and it may require some travel to different office locations. You should have technical skills in SAS Base programming, SAS Enterprise Guide, SAS Viya, SAS Studio, SAS Visual Analytics, CAS programming, Git version control, data modelling, and ETL processes. Soft skills required include strong analytical and problem-solving abilities, excellent communication skills, team collaboration, project management, time management, documentation skills, and training and mentoring abilities. The ideal candidate will have a Bachelor's degree in Computer Science, Statistics, or a related field, with 5+ years of experience in SAS programming. You should have strong expertise in SAS Enterprise Guide, hands-on experience with the SAS Viya platform, proficiency in SAS Studio and Visual Analytics, knowledge of CAS (Cloud Analytic Services), experience with REST APIs and web services, and a strong understanding of data management principles; experience working in dual-shore engagements is preferred. Preferred qualifications include SAS certifications, experience with cloud platforms (AWS, Azure, GCP), knowledge of Python or R programming, project management experience, experience with Agile methodologies, and previous migration project experience. EXL Analytics offers an exciting, fast-paced, and innovative environment where you can work closely with highly experienced analytics consultants. You can expect to learn many aspects of the businesses that clients engage in, along with effective teamwork and time-management skills. The company invests heavily in training employees in all aspects of analytics and in leading analytical tools and techniques. Guidance and coaching are provided through a mentoring program in which every junior-level employee is assigned a senior-level professional as an advisor. The unique experiences gathered at EXL Analytics set the stage for further growth and development in the company and beyond.
EOE/Minorities/Females/Vets/Disabilities

Posted 1 month ago

Apply

2.0 - 4.0 years

3 - 8 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: Global Data Analyst (Media Monitoring & Insights). Job Summary: The candidate will be responsible for media monitoring, data processing, and insights generation for global automotive markets, focusing on SOV (Share of Voice) analysis. The role requires expertise in data wrangling, ETL processes, report generation, and visualization using R, Excel, and Power BI. Additionally, the analyst will work closely with cross-functional teams to enhance data quality and optimize workflows. Key Responsibilities: Conduct global SOV analysis for the automotive sector, focusing on market trends and competitive benchmarking. Extract, clean, and transform data using R and Excel to ensure accurate reporting. Automate data visualization processes using Power BI to deliver interactive dashboards. Identify and resolve discrepancies in data, particularly for regions such as Canada, India, the USA, and the Middle East. Perform ETL operations to streamline data delivery and optimize processing efficiency. Collaborate with marketing science, research, and local market teams to validate data accuracy. Develop and maintain reports on automotive market share trends, brand performance, and media insights. Work on historical data corrections and enhancements to improve reporting consistency. Support global data initiatives by integrating multiple data sources for enhanced analytics. Provide insights and recommendations based on data trends to assist in media planning and strategy. Required Skills & Software Knowledge: Data Analysis & ETL: Strong experience in data extraction, transformation, and loading (ETL) processes. Programming: Proficiency in R for data manipulation, cleaning, and automation. Excel: Advanced Excel skills, including VBA, macros, pivot tables, and data wrangling techniques. Power BI: Experience in designing and maintaining interactive dashboards for data visualization.
Media & Marketing Analytics: Understanding of SOV analysis, competitive intelligence, and media performance metrics. Database Management: Familiarity with managing structured and unstructured datasets. Collaboration Tools: Experience working with SharePoint, Kiteworks, and other collaboration platforms. Quality Assurance: Ability to validate and ensure data accuracy before final reporting.
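Share of Voice itself is simple arithmetic: a brand's mentions as a percentage of all tracked mentions in the market. A minimal Python sketch (the role above uses R, and the brand names and counts here are purely illustrative):

```python
def share_of_voice(mentions: dict) -> dict:
    """Compute each brand's Share of Voice as a percentage of
    total tracked mentions, rounded to one decimal place."""
    total = sum(mentions.values())
    if total == 0:
        return {brand: 0.0 for brand in mentions}
    return {brand: round(100 * count / total, 1) for brand, count in mentions.items()}

# Illustrative monthly mention counts for one market
monthly = {"Brand A": 450, "Brand B": 300, "Brand C": 250}
print(share_of_voice(monthly))  # Brand A holds 45.0% of the voice
```

The same ratio generalizes to other media metrics (share of impressions, share of spend) by swapping the counted quantity.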

Posted 1 month ago

Apply

10.0 - 15.0 years

10 - 15 Lacs

Bengaluru, Karnataka, India

On-site

Role Overview: The Big Data Architect will be responsible for the design, implementation, and management of the organization's big data infrastructure. The ideal candidate will have a strong technical background in big data technologies, excellent problem-solving skills, and the ability to work in a fast-paced environment. The role requires a deep understanding of data architecture, data modeling, and data integration techniques. About the Role: Design and implement scalable and efficient big data architecture solutions to meet business requirements. Develop and maintain data pipelines, ensuring the availability and quality of data. Collaborate with data scientists, data engineers, and other stakeholders to understand data needs and provide technical solutions. Lead the evaluation and selection of big data tools and technologies. Ensure data security and privacy compliance. Optimize and tune big data systems for performance and cost efficiency. Document data architecture, data flows, and processes. Stay up to date with the latest industry trends and best practices in big data technologies. About You: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Overall 10+ years of experience, with 5+ years in big data architecture and engineering. Proficiency in big data technologies such as Hadoop MapReduce, Spark (batch and streaming), Kafka, HBase, Scala, Elasticsearch, and others. Experience with the AWS cloud platform. Strong knowledge of data modeling, ETL processes, and data warehousing. Proficiency in programming languages such as Java and Scala, including Spark development. Familiarity with data visualization tools and techniques. Excellent communication and collaboration skills. Strong problem-solving abilities and attention to detail.

Posted 1 month ago

Apply

12.0 - 15.0 years

12 - 15 Lacs

Bengaluru, Karnataka, India

On-site

We are seeking an experienced Senior Solution Architect specializing in Database Technology. You will be responsible for architecting, designing, and implementing complex database solutions for our enterprise clients, leveraging your expertise in both relational and non-relational database systems to create scalable, secure, and high-performance solutions that meet our clients' business needs. The ideal candidate will have deep technical knowledge of various database platforms, a strong understanding of cloud-based database solutions, and the ability to work closely with stakeholders to ensure optimal architecture decisions. Technical Expertise: In-depth experience with relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra, DynamoDB). Strong understanding of cloud-based database platforms (AWS RDS, Azure SQL Database, Google Cloud SQL, etc.). Expertise in database design, optimization, and scaling for both transactional and analytical workloads. Familiarity with data warehousing, ETL processes, and data integration technologies.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Title: Data Analytics (Senior-Level Profile). Experience Level: Senior (4+ years). Department: Data & Analytics / Business Intelligence. Key Responsibilities: Design and implement enterprise-grade Power BI solutions, including complex dashboards, reports, and data models. Lead data modeling efforts using star/snowflake schemas and best practices in Power BI and SQL-based environments. Write optimized and complex SQL queries to extract, transform, and load data from various data sources. Collaborate with data engineers, architects, and business stakeholders to design efficient and scalable BI architectures. Define and implement data governance, data quality, and security (e.g., Row-Level Security in Power BI). Work closely with business users to gather and refine requirements, ensuring dashboards meet user expectations and business objectives. Optimize Power BI performance through proper data modeling, query optimization, and incremental data refresh. Act as a mentor to junior BI developers, conducting code reviews and promoting best practices. Stay up to date on emerging trends in data analytics and visualization tools, particularly within the Microsoft ecosystem. Required Skills: 4+ years of hands-on experience in Power BI development with advanced knowledge of Power Query (M language) and DAX. Expert-level experience in SQL and working with relational databases (e.g., SQL Server, Azure SQL, Oracle). Deep understanding of data modeling techniques (normalized, denormalized, star schema, snowflake schema, etc.).
Strong experience working with large datasets, building reusable datasets, and optimizing Power BI performance. Knowledge of ETL processes, data integration, and data pipeline design (experience with tools like SSIS, Talend, or Azure Data Factory is a plus). Exposure to cloud data platforms (Azure Synapse, Azure Data Lake, etc.) and Power Platform tools (Power Apps, Power Automate) is a plus. Excellent problem-solving, communication, and stakeholder management skills. Ability to present complex data in a clear and concise manner to non-technical audiences. Ability to work independently and as part of a team in a fast-paced environment. Preferred Qualifications: Microsoft Certified: Power BI Data Analyst Associate (PL-300). Experience with additional data analytics or visualization tools (e.g., Tableau, Qlik) is a plus. Experience in Agile/Scrum environments. Familiarity with scripting languages (e.g., Python) or advanced analytics tools is a plus. Prior experience in domain-specific analytics (e.g., finance, healthcare, supply chain) is an advantage. Work Environment: This position requires the candidate to log in from the nearest office for at least 2 days a week as part of the latest Return to Office mandate, which may be amended as required. Log in from the base office (Kochi/Trivandrum) once a quarter if requested by management. The role may involve short-term business travel to the MENA region for project-related work, with durations ranging from 1 to 3 months as needed. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. 
Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
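Row-Level Security comes up in the responsibilities above. In Power BI it is defined as DAX filter expressions attached to roles; the underlying idea, that each user sees only the rows they are entitled to, can be sketched in plain SQL. A minimal illustration (sqlite3 stands in for the warehouse; the table, regions, and user-to-region mapping are hypothetical):

```python
import sqlite3

# Hypothetical mapping of report users to the one region each may see
USER_REGION = {"asha@example.com": "SOUTH", "liam@example.com": "NORTH"}

def rows_for_user(conn: sqlite3.Connection, user: str):
    """Return only the sales rows the given user is entitled to see."""
    region = USER_REGION.get(user)
    if region is None:
        return []  # unknown users see nothing by default
    return conn.execute(
        "SELECT region, amount FROM sales WHERE region = ? ORDER BY amount DESC",
        (region,),
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("SOUTH", 120.0), ("NORTH", 80.0), ("SOUTH", 40.0)])
print(rows_for_user(conn, "asha@example.com"))  # only the SOUTH rows
```

In Power BI the filter is evaluated by the engine per role, so the restriction applies to every visual automatically rather than per query as here.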

Posted 1 month ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

As a Junior Data Engineer at Capgemini Engineering, you will play a crucial role in the development and maintenance of scalable data pipelines and infrastructure. Your main focus will involve integrating and optimizing data workflows using Cognite Data Fusion and Python to ensure seamless data accessibility and analysis throughout the organization. Your responsibilities will include designing, implementing, and optimizing end-to-end data pipelines for processing large volumes of structured and unstructured data. You will utilize Cognite Data Fusion to automate the contextualization of various data sources, ensuring efficient integration and accessibility. Additionally, you will develop robust ETL processes and data workflows using Python, collaborate with cross-functional teams to understand data requirements, and implement data quality assurance measures to maintain accuracy and reliability. To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field, and demonstrate proficiency in Python programming. Knowledge of data integration platforms, data modeling, database design, and SQL is essential. Experience with cloud platforms like AWS or Azure, along with problem-solving skills, attention to detail, and excellent communication abilities, will be beneficial. Preferred qualifications include experience with data pipeline and workflow management tools, knowledge of big data technologies, and familiarity with data visualization tools like Grafana. Join us at Capgemini Engineering and be part of a dynamic team dedicated to leveraging data-driven insights to drive business strategies.
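One of the data quality assurance measures described above can be sketched as a validation pass that separates clean records from rejects before they enter a pipeline. The field names and rules here are illustrative assumptions, not Cognite Data Fusion APIs:

```python
def validate_records(records):
    """Split incoming records into valid rows and rejects.
    A record is valid when it has a non-empty 'id' and a numeric,
    non-negative 'value' (illustrative rules only)."""
    valid, rejected = [], []
    for rec in records:
        ok = (
            bool(rec.get("id"))
            and isinstance(rec.get("value"), (int, float))
            and rec["value"] >= 0
        )
        (valid if ok else rejected).append(rec)
    return valid, rejected

batch = [
    {"id": "s1", "value": 3.2},
    {"id": "", "value": 1.0},   # missing id  -> rejected
    {"id": "s2", "value": -4},  # negative    -> rejected
]
good, bad = validate_records(batch)
print(len(good), len(bad))  # 1 valid record, 2 rejects
```

Routing rejects to a quarantine table (rather than silently dropping them) is the usual way such checks keep pipelines auditable.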

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

chennai, tamil nadu

On-site

This role is based in Chennai, Tamil Nadu, and offers a competitive salary. Ignitho Inc., a prominent AI and data engineering company with a global presence, is looking for a Senior Power BI Developer with 2 to 5 years of experience. As a Senior Power BI Developer at Ignitho, you will play a crucial role in designing, developing, and maintaining interactive reports and dashboards to facilitate data-driven decision-making by business users. Your responsibilities will include designing and developing robust Power BI reports and dashboards aligned with business objectives, building complex semantic models, utilizing DAX and Power Query for high-performance BI solutions, integrating data from multiple databases, developing reports with custom visuals and interactive elements, and implementing data loading through XMLA endpoints. You will also design and manage Power Automate flows, create and maintain Paginated Reports, integrate advanced analytics tools such as Python and R within Power BI, apply SQL skills and ETL processes for data transformation, and follow Agile methodologies for BI solution delivery. Additionally, you will manage deployment pipelines and version control, administer Power BI environments (including workspace management, security, and content sharing), and leverage Power BI Embedded or the REST API for advanced integration and automation. To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field, along with 2 to 5 years of hands-on experience in developing Power BI reports and dashboards. Proven expertise in DAX, Power Query, and data modelling is essential. Experience with Databricks, familiarity with Python, R, or other data analysis tools, and Power BI certification are considered advantageous. Join us at Ignitho Inc. and be a part of our innovative and dynamic team shaping the future of AI and data engineering.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a skilled professional in data handling and processing, you will demonstrate expertise in SQL Server and query optimization, ensuring efficient application data design and process management. Your knowledge in data modeling will be extensive, supported by hands-on experience with Azure Data Factory, Azure Synapse Analytics, and Microsoft Fabric. Additionally, your familiarity with Azure Databricks will be utilized in your work. In the realm of data warehouse development, your proficiency in SSIS (SQL Server Integration Services) and SSAS (SQL Server Analysis Services) will be essential. You will excel in ETL processes, encompassing data extraction, transformation, and loading, including data cleaning and normalization. Moreover, your exposure to big data technologies like Hadoop, Spark, and Kafka will enable you to handle large-scale data processing efficiently. Your understanding of data governance, compliance, and security measures within Azure environments will be integral to your role. Furthermore, your expertise in data analysis, statistical modeling, and machine learning techniques will drive insightful decision-making. Proficiency in analytical tools such as Python, R, and libraries like Pandas and NumPy will be leveraged for data analysis and modeling. You will showcase a strong command of Power BI for data visualization, data modeling, and DAX queries, following best practices in the field. Implementing Row-Level Security in Power BI will be one of your key responsibilities. Handling medium-complex data models and quickly grasping application data design and processes will be part of your routine tasks. In addition to technical skills, you will lead a team of 4-5 developers, ensuring timely deliverables and fostering a culture of continuous learning. Your communication skills in English, both written and verbal, will be crucial for effective collaboration with customers. 
You will adeptly explain complex technical concepts to non-technical stakeholders, bridging the gap between technical and non-technical perspectives. Your proficiency in SQL, Azure Synapse Analytics, Azure Analysis Services, and data marts will enable effective data management, and ETL tools such as Azure Data Factory, Azure Databricks, Python, and SSIS will streamline your workflow. Data visualization will be a key aspect of the role, with Power BI and DAX serving as your primary tools for creating impactful visual representations of data. Overall, you will be responsible for handling and processing data efficiently, leveraging your technical and non-technical skills to drive valuable insights and support strategic decision-making.
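Data normalization, part of the cleaning step in the ETL work described above, can be sketched with a stdlib-only min-max scaler. In practice this would use Pandas/NumPy as the role notes; the numbers are illustrative:

```python
def min_max_scale(values):
    """Scale a list of numbers into the [0, 1] range.
    Constant inputs map to all zeros to avoid division by zero."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

# The minimum maps to 0.0 and the maximum to 1.0
print(min_max_scale([10, 20, 40]))
```

Min-max scaling preserves the ordering and relative spacing of values while putting differently scaled columns onto a common range before modeling.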

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies