
1055 ETL Processes Jobs - Page 27

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

As the Lead Power BI Developer at Stackular, you will play a crucial role in designing, developing, and maintaining business intelligence solutions on Microsoft's Power BI platform. You will work closely with stakeholders to understand business requirements, generate insightful reports and dashboards, and deliver data-driven insights that steer decision-making. Your expertise in data modeling, data visualization, and transforming intricate data sets into actionable business intelligence will be essential for success in this role.

Your key responsibilities include designing, developing, and maintaining Power BI reports and dashboards, creating data models to support reporting needs, and implementing best practices in data visualization to provide clear and actionable insights. You will also connect to diverse data sources such as SQL databases, Excel, and cloud-based sources, optimize DAX calculations and queries, and ensure data accuracy and integrity through ETL processes.

Collaboration is a significant part of the role: you will work closely with business stakeholders to gather requirements, partner with data engineers and team members to ensure seamless data integration, and offer training and support to end users on Power BI functionality. You will also monitor and optimize the performance of Power BI solutions, troubleshoot and resolve data quality and performance issues, and uphold the highest standards of quality in everything you do.

To qualify, you should hold a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field; a master's degree is a plus. A minimum of 8 years of experience in business intelligence and data analytics is required, along with a proven track record of developing Power BI solutions. Proficiency in Power BI (including DAX and Power Query), SQL, data modeling, and ETL processes is essential. Experience with Azure Data Services such as Azure SQL and Azure Data Factory is advantageous, and familiarity with other BI tools is a plus.

Beyond technical skills, strong analytical and problem-solving abilities, excellent communication and presentation skills, the capacity to work independently and collaboratively, and keen attention to detail will contribute to your success. In return, we offer a culture deeply rooted in our core values, a competitive salary and benefits package, opportunities for personal and professional growth, and the chance to work for a company with a really cool name. Join us at Stackular and be part of our dynamic product development community where your skills and values align with our shared vision.
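
For illustration, the kind of ETL-stage accuracy check this role describes might look like the following minimal Python sketch; the file name, column names, and rules are invented for the example, not Stackular's actual schema:

```python
import pandas as pd

# Hypothetical extract; file and column names are illustrative assumptions.
df = pd.read_csv("sales.csv", parse_dates=["order_date"])

# Basic integrity checks before the data reaches the Power BI model.
assert df["order_id"].is_unique, "duplicate order IDs in extract"
assert df["amount"].ge(0).all(), "negative sale amounts found"
print("Missing values per key column:\n", df[["customer_id", "region"]].isna().sum())

# Simple transformation step: monthly totals for the reporting layer.
monthly = (
    df.assign(month=df["order_date"].dt.to_period("M"))
      .groupby(["month", "region"], as_index=False)["amount"].sum()
)
monthly.to_csv("monthly_sales.csv", index=False)
```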

Posted 1 month ago

Apply

6.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

About Calfus: Calfus is a Silicon Valley-headquartered software engineering and platforms company with a vision deeply rooted in the Olympic motto "Citius, Altius, Fortius Communiter". At Calfus, we aim to inspire our team to rise faster, higher, and stronger while fostering a collaborative environment to build software at speed and scale. Our primary focus is creating engineered digital solutions that drive positive business outcomes. Upholding principles of #Equity and #Diversity, we strive to create a diverse ecosystem that extends to the broader society. Join us at #Calfus and embark on an extraordinary journey!

Position Overview: As a Data Engineer specializing in BI Analytics & DWH, you will be instrumental in crafting and implementing robust business intelligence solutions that empower our organization to make informed, data-driven decisions. Leveraging your expertise in Power BI, Tableau, and ETL processes, you will develop scalable architectures and interactive visualizations. The role demands a strategic mindset, strong technical acumen, and effective collaboration with stakeholders at all levels.

Key Responsibilities:
- BI Architecture & DWH Solution Design: Develop and design scalable BI analytics and DWH solutions aligned with business requirements, utilizing tools like Power BI and Tableau.
- Data Integration: Supervise ETL processes through SSIS to ensure efficient data extraction, transformation, and loading into data warehouses.
- Data Modelling: Establish and maintain data models that support analytical reporting and data visualization initiatives.
- Database Management: Employ SQL for crafting intricate queries and stored procedures, and for managing data transformations via joins and cursors.
- Visualization Development: Spearhead the design of interactive dashboards and reports in Power BI and Tableau while adhering to best practices in data visualization.
- Collaboration: Engage closely with stakeholders to gather requirements and translate them into technical specifications and architecture designs.
- Performance Optimization: Analyze and optimize BI solutions for enhanced performance, scalability, and reliability.
- Data Governance: Implement data quality and governance best practices to ensure accurate reporting and compliance.
- Team Leadership: Mentor and guide junior BI developers and analysts to cultivate a culture of continuous learning and improvement.
- Azure Databricks: Utilize Azure Databricks for data processing and analytics, integrating seamlessly with existing BI solutions.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
- 6-12 years of experience in BI architecture and development, with a strong emphasis on Power BI and Tableau.
- Proficiency in ETL processes and tools, particularly SSIS.
- Strong command of SQL Server, encompassing advanced query writing and database management.
- Proficiency in exploratory data analysis using Python.
- Familiarity with the CRISP-DM model.
- Ability to work with various data models and databases such as Snowflake, Postgres, Redshift, and MongoDB.
- Experience with visualization tools such as Power BI, QuickSight, Plotly, and Dash.
- Strong programming foundation in Python for data manipulation, analysis, serialization, database interaction, data pipelines and ETL tools, cloud services, and more.
- Familiarity with the Azure SDK is a plus.
- Experience with code quality management, version control, collaboration on data engineering projects, and interaction with REST APIs and web scraping tasks is advantageous.

Calfus Inc. is an Equal Opportunity Employer.
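
The qualifications call for exploratory data analysis in Python. A minimal pandas profiling sketch, with a hypothetical extract and column names assumed purely for illustration, could look like:

```python
import pandas as pd

# Hypothetical warehouse extract used purely for illustration.
df = pd.read_csv("orders_extract.csv")

# Quick profile: shape, dtypes, null counts, and basic distribution stats.
print(df.shape)
print(df.dtypes)
print(df.isna().sum().sort_values(ascending=False).head(10))
print(df.describe(include="all").T)

# Spot candidate outliers in a numeric column (column name is an assumption).
q1, q3 = df["order_value"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["order_value"] < q1 - 1.5 * iqr) | (df["order_value"] > q3 + 1.5 * iqr)]
print(f"{len(outliers)} potential outliers by the 1.5*IQR rule")
```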

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Salesforce Data Cloud Analyst will play a crucial role in leveraging Salesforce Data Cloud to transform how our organization uses customer data. This position sits within the Data Cloud Business Enablement Team and focuses on building, managing, and optimizing our data unification strategy to power business intelligence, marketing automation, and customer experience initiatives.

You will be responsible for managing data models within Salesforce Data Cloud to ensure optimal data harmonization across multiple sources, and for maintaining data streams from various platforms into Data Cloud, including CRM, SFMC, MCP, Snowflake, and third-party applications. Developing and optimizing SQL queries to transform raw data into actionable insights will be a key aspect of the role. You will collaborate with marketing teams to translate business requirements into effective data solutions, monitor data quality, and implement processes to ensure accuracy and reliability. You will also create documentation for data models, processes, and best practices, and provide training and support to business users on leveraging Data Cloud capabilities.

To succeed in this role, you should possess advanced knowledge of Salesforce Data Cloud architecture and capabilities, strong SQL skills for data transformation and query optimization, and experience with ETL processes and data integration patterns. An understanding of data modeling principles, data privacy regulations, and compliance requirements is essential. A Bachelor's degree in Computer Science, Information Systems, or a related field, along with 5+ years of experience working with Salesforce platforms, is required; Salesforce Data Cloud certification is preferred. A background in marketing technology or customer experience initiatives, previous work with Customer Data Platforms (CDPs), experience with Tableau CRM or other visualization tools, a Salesforce Administrator or Developer certification, and familiarity with Agile ways of working, Jira, and Confluence are all beneficial.

The role offers the opportunity to shape how our organization leverages customer data to drive meaningful business outcomes and exceptional customer experiences. Novartis is committed to creating an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve. If you require reasonable accommodation due to a medical condition or disability, please reach out to [email protected] to discuss your needs. Join us at Novartis and become part of a community dedicated to making a positive impact on people's lives through innovative science and collaboration. Visit our website to learn more about our mission and culture. If this role is not the right fit, consider joining our talent community to stay informed about future opportunities within Novartis, and explore our handbook to discover the benefits and rewards we offer to support your personal and professional growth.
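
As a rough illustration of the SQL-driven segmentation work described above, a query of this shape might select a high-value, recently active audience; the table and column names are assumptions for the sketch, not the actual Data Cloud schema:

```python
# Illustrative segmentation-style SQL; tables and columns are invented for
# the example and will differ in a real Data Cloud / Snowflake setup.
SEGMENT_QUERY = """
SELECT c.customer_id,
       c.email,
       SUM(t.amount)        AS total_spend,
       MAX(t.purchase_date) AS last_purchase
FROM   unified_customer c
JOIN   transactions t ON t.customer_id = c.customer_id
GROUP  BY c.customer_id, c.email
HAVING SUM(t.amount) > 500
   AND MAX(t.purchase_date) >= DATEADD(day, -90, CURRENT_DATE)
"""

def build_segment(cursor):
    """Run the query on any DB-API cursor and return qualifying customers."""
    cursor.execute(SEGMENT_QUERY)
    return cursor.fetchall()
```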

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

NTT DATA is looking for a PowerBI Data Visualisation Engineer to join their team in Ban/Hyd/Chn/Gur/Noida, Karnataka (IN-KA), India. As a PowerBI Data Visualisation Engineer, you will design, develop, and maintain interactive dashboards and reports using PowerBI, collaborating with data analysts, data engineers, designers, and business stakeholders to gather requirements and ensure accurate, timely, and effective delivery of data visualisations.

A key responsibility will be transforming raw data into meaningful insights while ensuring data accuracy and consistency. You will also optimize dashboards for performance and usability, giving users intuitive and efficient access to key metrics, and stay current with the latest trends and best practices in data visualisation, continuously seeking opportunities to enhance the company's data analytics capabilities.

On the technical side, you must have expertise in PowerBI, PowerAutomate, Azure DevOps, SQL, Databricks, data modelling, data warehousing, ETL processes, and Python. Soft skills such as exceptional analytical ability, attention to detail, excellent communication, creativity in proposing innovative visualisation solutions, and a proactive attitude are crucial for this role.

To qualify, you should have proven experience as a Data Visualisation Engineer or in a similar role, proficiency in PowerBI, and a strong understanding of SQL, data warehousing, and ETL processes. Experience with programming languages such as Python, R, or JavaScript is considered a plus.

NTT DATA is a global innovator of business and technology services, serving 75% of the Fortune Global 100. As a Global Top Employer, they have diverse experts in more than 50 countries and a robust partner ecosystem. Their services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA invests significantly in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You are a highly skilled, detail-oriented, and motivated Python DQ Automation Developer responsible for designing, developing, and maintaining data quality automation solutions in Python. With a deep understanding of data quality principles, proficiency in Python, and experience in data processing and analysis, you will play a crucial role in ensuring accurate and timely data integration and transformation.

Your key responsibilities include designing, developing, and implementing data quality automation processes and solutions to identify, measure, and improve data quality. You will write and optimize Python scripts using libraries such as Pandas, NumPy, and PySpark for data manipulation and processing. Additionally, you will develop and enhance ETL processes, analyze data sets to identify data quality issues, and develop and execute test plans to validate the effectiveness of data quality solutions. You will maintain comprehensive documentation of data quality processes, procedures, and standards, and collaborate closely with data analysts, data engineers, DQ testers, and other stakeholders to understand data requirements and deliver high-quality data solutions.

Required Skills:
- Proficiency in Python and related libraries (Pandas, NumPy, PySpark, pytest).
- Experience with data quality tools and frameworks.
- Strong understanding of ETL processes and data integration.
- Familiarity with data governance and data management principles.
- Excellent analytical and problem-solving skills with keen attention to detail.
- Strong verbal and written communication skills to explain technical concepts to non-technical stakeholders.
- Ability to work effectively both independently and as part of a team.

Qualifications:
- Bachelor's degree in Computer Science or Information Technology; an advanced degree is a plus.
- Minimum of 7 years of experience in data quality automation and Python development.
- Proven experience with Python libraries for data processing and analysis.

Citi is an equal opportunity and affirmative action employer, encouraging all qualified and interested applicants to apply for career opportunities. If you require a reasonable accommodation due to a disability, please review Accessibility at Citi.
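
As an illustration of the kind of test plan this role describes, data quality rules can be expressed as pytest cases over a pandas frame; the feed name, key columns, and business rule below are assumptions for the sketch:

```python
import pandas as pd

def load_feed() -> pd.DataFrame:
    # Stand-in for the real ingestion step; the file name is an assumption.
    return pd.read_csv("daily_feed.csv", parse_dates=["trade_date"])

def test_no_duplicate_keys():
    df = load_feed()
    assert not df.duplicated(subset=["trade_id"]).any()

def test_mandatory_fields_populated():
    df = load_feed()
    assert df[["trade_id", "trade_date", "notional"]].notna().all().all()

def test_notional_within_bounds():
    df = load_feed()
    # Business rule invented for the example: notionals must be positive.
    assert (df["notional"] > 0).all()
```

Run with `pytest` so each rule reports independently, which keeps failed checks easy to triage.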

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

As a SQL Developer with 5-10 years of experience, you will be a valuable addition to our IT team, focusing on the development of a new financial platform designed to support valuation teams in their daily, monthly, and quarterly marking of credit investments. Your responsibilities will include connecting to multiple pricing sources, implementing pricing rules, managing approval workflows, and integrating with downstream applications for reporting and further processing.

This role offers a unique opportunity to contribute to the development of a platform from the ground up, working closely with a diverse team of developers, infrastructure engineers, and QA specialists in a dynamic and collaborative environment.

Key qualifications include a minimum of 5 years of SQL development experience within financial services or a related industry. You should possess a strong grasp of database design principles, particularly in SQL Server, encompassing DDL, DML, tables, views, stored procedures, functions, indexes, and dynamic SQL. Your experience should include handling large datasets, query optimization, and resolving SQL-related issues efficiently. An understanding of financial products and valuation concepts such as discount rates, cap rates, and exit multiples is advantageous. Proficiency in integration services, reporting platforms, and workflow automation, along with familiarity with cloud databases, is also desirable, as are strong problem-solving and analytical skills for troubleshooting database-related challenges.

Experience with Azure SQL, AWS RDS, or other cloud-based database solutions is preferred, and knowledge of scripting languages like Python for automation, plus exposure to data warehousing and ETL processes, would be beneficial. If you are a proactive individual who thrives in a fast-paced Agile environment and can work both independently and collaboratively, we encourage you to apply and join our innovative financial platform development team.
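
To illustrate the "multiple pricing sources plus pricing rules" idea in miniature, a source-priority rule could be prototyped in pandas before being encoded in SQL; the vendor names and priority rule are invented for the example:

```python
import pandas as pd

# Hypothetical quotes from multiple pricing vendors for the same instruments.
quotes = pd.DataFrame({
    "instrument": ["BOND_A", "BOND_A", "BOND_B"],
    "source":     ["vendor1", "vendor2", "vendor2"],
    "price":      [99.45, 99.52, 101.10],
})

# Invented rule: prefer vendor1, fall back to vendor2.
priority = {"vendor1": 1, "vendor2": 2}
best = (
    quotes.assign(rank=quotes["source"].map(priority))
          .sort_values(["instrument", "rank"])
          .drop_duplicates("instrument", keep="first")
          .drop(columns="rank")
)
print(best)  # one chosen mark per instrument, ready for an approval workflow
```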

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are looking for a highly motivated and experienced Data and Analytics Senior Architect to lead our Master Data Management (MDM) and Data Analytics team. As the Data and Analytics Architect Lead, you will define and implement the overall data architecture strategy to ensure alignment with business goals and support data-driven decision-making. You will design scalable, secure, and efficient data systems, including databases, data lakes, and data warehouses, and evaluate and recommend tools and technologies for data integration, processing, storage, and analytics while staying current with industry trends.

You will lead a high-performing team, fostering a collaborative and innovative culture and ensuring data integrity, consistency, and availability across the organization. You will manage the existing MDM solution and data platform, based on Microsoft Data Lake Gen 2, Snowflake as the DWH, and Power BI, managing data from core applications, and drive further development to handle additional data and capabilities in support of our AI journey. The ideal candidate possesses strong leadership skills, a deep understanding of data management and technology principles, and the ability to collaborate effectively across departments and functions.

**Principal Duties and Responsibilities:**

**Team Leadership:**
- Lead, mentor, and develop a high-performing team of data analysts and MDM specialists.
- Foster a collaborative and innovative team culture that encourages continuous improvement and efficiency.
- Provide technical leadership and guidance to the development teams and oversee the implementation of IT solutions.

**Architecture:**
- Define the overall data architecture strategy, aligning it with business goals and ensuring it supports data-driven decision-making.
- Identify, evaluate, and establish shared enabling technical capabilities for the division in collaboration with IT to ensure consistency, quality, and business value.
- Design and oversee the implementation of data systems, including databases, data lakes, and data warehouses, ensuring they are scalable, secure, efficient, and cost-effective.
- Evaluate and recommend tools and technologies for data integration, processing, storage, and analytics, staying updated on industry trends.

**Strategic Planning:**
- Develop and implement the MDM and analytics strategy aligned with overall team and organizational goals.
- Work with the enterprise architect to align on the overall strategy and application landscape, ensuring MDM and data analytics fit into the ecosystem.
- Identify opportunities to enhance data quality, governance, and analytics capabilities.

**Project Management:**
- Oversee project planning, execution, and delivery to ensure timely and successful completion of initiatives.
- Monitor project progress and cost, identify risks, and implement mitigation strategies.

**Stakeholder Engagement:**
- Collaborate with cross-functional teams to understand data needs and deliver solutions that support business objectives.
- Serve as a key point of contact for data-related inquiries and support requests.
- Develop business cases and proposals for IT investments and present them to senior management and stakeholders.

**Data/Information Governance:**
- Establish and enforce data/information governance policies and standards to ensure compliance and data integrity.
- Champion best practices in data management and analytics across the organization.

**Reporting and Analysis:**
- Utilize data analytics to derive insights and support decision-making processes.
- Document and present findings and recommendations to senior management.

**Knowledge, Skills and Abilities Required:**
- Bachelor's degree in Computer Science, Data Science, Information Management, or a related field; master's degree preferred.
- 10+ years of experience in data management, analytics, or a related field, with at least 2 years in a leadership role.
- Strong knowledge of master data management concepts, data governance, data technology, and analytics tools.
- Proficiency in data modeling, ETL processes, database management, big data technologies, and data integration techniques.
- Excellent project management skills with a proven track record of delivering complex projects on time and within budget.
- Strong analytical, problem-solving, and decision-making abilities.
- Exceptional communication and interpersonal skills.
- Team player, result-oriented, structured, with attention to detail and a strong work ethic.

**Special Competencies Required:**
- Proven leader with excellent structural skills, good at documenting and presenting.
- Strong executional skills to make things happen, not just generate ideas.
- Experience working with analytics tools and data ingestion platforms.
- Experience working with MDM solutions, preferably TIBCO EBX.
- Experience working with Jira/Confluence.

**Additional Information:**
- Office, remote, or hybrid working.
- Ability to function across variable time zones.
- International travel may be required.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Thrissur, Kerala

On-site

As a Data Engineer at WAC, you will be responsible for the availability, reliability, and scalability of the data infrastructure, collaborating closely with cross-functional teams to support data-driven initiatives and enabling data scientists, analysts, and business stakeholders to access high-quality data for critical decision-making.

You will design, develop, and maintain efficient ETL processes and data pipelines to collect, process, and store data from various sources, and create and manage data warehouses and data lakes, optimizing storage and query performance for both structured and unstructured data. Implementing data quality checks, validation processes, and error handling will be crucial for ensuring data accuracy and consistency. You will also administer and optimize relational and NoSQL databases for data integrity and high availability, identify and address performance bottlenecks in data pipelines and databases to improve overall system efficiency, and implement data security measures and access controls to protect sensitive data assets.

Collaboration with data scientists, analysts, and stakeholders to understand their data needs and support analytics and reporting projects is integral to the job, as is maintaining clear and comprehensive documentation for data processes, pipelines, and infrastructure. You will monitor data pipelines and databases, proactively identify issues, and troubleshoot and resolve data-related problems in a timely manner.

To qualify, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, at least 4 years of experience in data engineering roles, and proficiency in programming languages such as Python, Java, or Scala. Experience with data warehousing solutions and database systems is necessary, along with strong knowledge of ETL processes, data integration, and data modeling. Familiarity with data orchestration and workflow management tools, an understanding of data security best practices and data governance principles, excellent problem-solving skills, the ability to work in a fast-paced collaborative environment, and strong communication skills for explaining complex technical concepts to non-technical team members round out the requirements.

Thank you for your interest in joining the team at Webandcrafts. We look forward to learning more about your candidacy through this application.
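
As a minimal sketch of the error handling the role calls for, an ETL stage can be wrapped in a retry-with-backoff helper; the extract/transform/load stubs are placeholders, not WAC's actual pipeline:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def run_step(step, retries: int = 3, backoff: float = 2.0):
    """Run one pipeline stage with simple retry-and-backoff error handling."""
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception as exc:  # in practice, catch narrower exceptions
            log.warning("step %s failed (attempt %d/%d): %s",
                        step.__name__, attempt, retries, exc)
            if attempt == retries:
                raise
            time.sleep(backoff ** attempt)

def extract():
    return [{"id": 1, "value": "42"}]      # stand-in for a real source

def transform(rows):
    return [{"id": r["id"], "value": int(r["value"])} for r in rows]

def load(rows):
    log.info("loaded %d rows", len(rows))  # stand-in for a warehouse write

rows = run_step(extract)
rows = run_step(lambda: transform(rows))
run_step(lambda: load(rows))
```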

Posted 1 month ago

Apply

5.0 - 8.0 years

4 - 6 Lacs

Hyderabad, Telangana, India

On-site

Job description - Technical & Professional requirements:
- Experience with the Hadoop ecosystem, particularly Hive for data querying and analysis
- Experience with data modeling and ETL processes
- Good knowledge of MySQL, with the ability to write complex queries and stored procedures and perform query optimization
- Capable of working with large datasets and performing data analysis
- Ability to work closely with and mentor the team, contribute to discussions, and present findings clearly

Experience: 5-8 years
Skills: Big Data, Hive, Spark, Sqoop, MySQL
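
A minimal sketch of the Hive-on-Spark querying this role involves might look like the following; the database and table names are assumptions for illustration:

```python
from pyspark.sql import SparkSession

# Minimal sketch; requires a Spark installation configured for Hive.
spark = (SparkSession.builder
         .appName("hive-analysis")
         .enableHiveSupport()
         .getOrCreate())

# Query a Hive table and aggregate, as the role's Hive/Spark work implies.
daily = spark.sql("""
    SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM   sales_db.orders
    GROUP  BY order_date
""")
daily.show(10)
```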

Posted 1 month ago

Apply

3.0 - 6.0 years

4 - 6 Lacs

Hyderabad, Telangana, India

On-site

Detailed JD (Roles and Responsibilities):
- Strong expertise in data warehouse and ETL testing concepts
- Expertise in Informatica/IDMC, Teradata/Postgres, Unix, manual and automation testing, and Agile ways of working
- Strong SQL skills

Mandatory skills: Informatica/IDMC, Teradata/Postgres, Unix, Automation (Java/Python)
Desired/secondary skills: AWS
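
A common ETL-testing pattern implied here is source-to-target reconciliation. A minimal sketch over generic DB-API cursors, with placeholder table and column names, could be:

```python
# Minimal source-vs-target reconciliation sketch for DW/ETL testing.
# Table names are placeholders; a real suite parameterizes these.

def row_count(cursor, table: str) -> int:
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

def checksum(cursor, table: str, column: str):
    # A coarse content check; real suites also compare column-level hashes.
    cursor.execute(f"SELECT COALESCE(SUM({column}), 0) FROM {table}")
    return cursor.fetchone()[0]

def reconcile(src_cur, tgt_cur):
    assert row_count(src_cur, "stg.orders") == row_count(tgt_cur, "dw.orders")
    assert checksum(src_cur, "stg.orders", "amount") == \
           checksum(tgt_cur, "dw.orders", "amount")
```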

Posted 1 month ago

Apply

2.0 - 6.0 years

5 - 6 Lacs

Hyderabad, Telangana, India

On-site

We are seeking an experienced Azure Data Engineer to join our team in India. The ideal candidate will have a strong background in data engineering and a deep understanding of Azure cloud services. This role involves designing and implementing robust data solutions that meet the needs of our organization.

Responsibilities:
- Design and implement data solutions on the Azure platform.
- Develop and maintain ETL processes using Azure Data Factory.
- Optimize data storage and retrieval using Azure SQL Database and Azure Synapse Analytics.
- Collaborate with data scientists and analysts to understand data requirements and provide appropriate data solutions.
- Ensure data quality and integrity by performing regular data validation and cleansing tasks.
- Monitor and troubleshoot data pipelines and workflows to ensure smooth operation.

Skills and Qualifications:
- 2-6 years of experience in data engineering or a related field.
- Proficiency in Azure cloud services, particularly Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics.
- Strong knowledge of data modeling, ETL processes, and data warehousing concepts.
- Experience with programming languages such as Python, SQL, or Scala.
- Familiarity with data visualization tools like Power BI or Tableau.
- Understanding of big data technologies such as Azure HDInsight or Azure Databricks.
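
As a hedged sketch of working with Azure Data Factory from code, the management SDK can trigger a pipeline run; the resource names are placeholders, and the exact SDK surface should be verified against current Azure documentation:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers; substitute real subscription/resource names.
credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")

run = client.pipelines.create_run(
    resource_group_name="rg-data",
    factory_name="adf-analytics",
    pipeline_name="daily_ingest",
    parameters={"load_date": "2024-01-01"},
)
print("started pipeline run:", run.run_id)
```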

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

You are a highly skilled Lead Data Engineer (Snowflake) with 7 to 10 years of experience, joining a dynamic team in Ahmedabad or Pune. Your expertise includes deep knowledge of Snowflake, cloud platforms, ETL processes, data warehousing concepts, and multiple programming languages. If you are passionate about working with large datasets, designing scalable database schemas, and solving complex data problems, we are excited to welcome you aboard!

Your responsibilities will involve designing, developing, and optimizing data pipelines using Snowflake and ELT/ETL tools; architecting, implementing, and maintaining data warehouse solutions with high performance and scalability; designing efficient database schemas and data models to support business needs; writing and optimizing complex SQL queries; and developing data transformation scripts in Python, C#, or Java for process automation. Ensuring data integrity, security, and governance throughout the data lifecycle will be paramount as you analyze, troubleshoot, and resolve data-related issues at various strategic levels. You will collaborate with cross-functional teams to understand business requirements and deliver data-driven solutions.

**Qualifications (Must Have):**
- Strong experience with Snowflake.
- Deep understanding of transactional databases, OLAP, and data warehousing concepts.
- Experience designing database schemas and data models.
- Proficiency in one programming language (Python, C#, or Java).
- Strong problem-solving and analytical skills.

**Good to Have:**
- SnowPro Core or SnowPro Advanced certification.
- Experience with cost/performance optimization.
- Client-facing experience with the ability to understand business needs.
- Ability to work collaboratively in a team environment.

**Perks:**
- Flexible timings
- 5-day work week
- Healthy environment
- Celebrations
- Learn and grow
- Build the community
- Medical insurance benefit
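
A minimal sketch of querying Snowflake from Python with the official connector might look like this; the account, credentials, warehouse, and table are placeholders:

```python
import snowflake.connector

# Placeholder connection details; use your own account identifier and
# a secure credential mechanism in real code.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    cur.execute("""
        SELECT region, SUM(amount) AS revenue
        FROM   fact_sales
        GROUP  BY region
        ORDER  BY revenue DESC
    """)
    for region, revenue in cur.fetchall():
        print(region, revenue)
finally:
    cur.close()
    conn.close()
```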

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a Senior Frontend Data Visualization Engineer at Bidgely, you will play a crucial role in creating exceptional UI experiences for energy analytics applications. Leveraging your expertise in React.js and Looker, you will develop, optimize, and maintain interactive dashboards and web applications, ensuring seamless production support and deployment while turning data into actionable insights. If you are a problem-solver who thrives in a collaborative environment, we are looking for someone like you.

Your key responsibilities span frontend development and optimization, post-release monitoring and performance analysis, collaboration and communication, and documentation and release management. You will develop and maintain high-performance React.js applications, design and optimize Looker dashboards, implement advanced filtering and drill-down capabilities, and ensure cross-browser compatibility and responsiveness. You will also monitor the performance and stability of deployed applications, troubleshoot production issues, collaborate with product teams, and provide technical solutions to stakeholders.

To excel in this role, you should have at least 2 years of experience in BI development and data analytics on cloud platforms. Proficiency in React.js and Looker, strong SQL skills, experience with REST APIs, and familiarity with CI/CD pipelines are essential, along with excellent collaboration, communication, and problem-solving skills and a strong understanding of non-functional requirements related to security, performance, and scale. Experience with Git, Confluence, and Notion for version control and documentation is preferred.

In return, Bidgely offers growth potential with a startup, a collaborative environment, unique tools for your role, group health insurance, internet/telephone reimbursement, a professional development allowance, gratuity, mentorship programs with industry experts, and flexible work arrangements. Bidgely is an equal-opportunity employer that values diversity: hiring is based on your skills, talent, and passion, without bias toward your background, gender, race, or age. Join us in building a better future and a better workforce at Bidgely, an E-Verify employer.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a highly motivated CDP Analyst with over 3 years of experience in Customer Data Platforms (CDP) and Marketing Automation tools. Your expertise lies in Treasure Data CDP, with hands-on experience in data integration, audience segmentation, and activation across various marketing channels. The role involves managing data pipelines, constructing customer profiles, and assisting marketing teams in delivering personalized customer experiences.

Your primary responsibilities will include:
- Implementing, configuring, and managing Treasure Data CDP for collecting, unifying, and activating customer data.
- Developing and maintaining data pipelines to ingest data from multiple sources into the CDP.
- Creating and managing audience segments based on behavioral, transactional, and demographic data.
- Collaborating closely with Marketing and CRM teams to integrate CDP data with various marketing automation platforms.
- Setting up and monitoring data activation workflows to ensure accurate targeting across paid media, email, SMS, and push notifications.
- Leveraging CDP data to generate actionable customer insights and support campaign personalization.
- Monitoring data quality, creating reports, and optimizing customer journeys based on performance data.
- Partnering with Data Engineering, Marketing, and IT teams to enhance data strategies and address data challenges.

To excel in this role, you should possess:
- 3+ years of experience working with CDP platforms, preferably Treasure Data, and Marketing Automation tools.
- Strong SQL skills for querying and data manipulation.
- Experience integrating CDP with marketing channels such as Google Ads, Meta, Salesforce, Braze, etc.
- Familiarity with APIs, ETL processes, and data pipelines.
- Knowledge of customer journey orchestration and data privacy regulations like GDPR and CCPA.
- Strong analytical and problem-solving abilities.
- Excellent communication skills and the ability to collaborate effectively with cross-functional teams.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a dynamic Tableau Analyst joining our Tools and Reporting team, playing a pivotal role in crafting insightful, actionable dashboards and providing essential PMO support for successful project delivery. Strong data experience, including proficiency in data manipulation tools such as Tableau, Python, or Knime, and familiarity with project management methodologies are essential for this role.

You will design, develop, and maintain interactive Tableau dashboards to visualize complex datasets effectively, and use Tableau Prep for data cleaning, transformation, and preparation for analysis. Developing complex calculations and custom formulas within Tableau to meet specific analytical needs, optimizing dashboard performance, ensuring data accuracy and integrity, and collaborating with stakeholders to gather requirements and translate them into effective dashboard designs are crucial aspects of this role.

On the PMO side, you will assist in planning, executing, and monitoring projects related to data visualization and analytics, applying project management methodologies such as Agile, Iterative, and Waterfall to manage tasks and timelines. Tracking project progress, identifying potential risks, proactively addressing issues, communicating status updates to stakeholders, and maintaining project documentation and artifacts are all part of your support responsibilities.

You should demonstrate a strong understanding of data concepts, including data modeling, data warehousing, and ETL processes, and be familiar with Tableau Prep, Python, or Knime for manipulating and transforming data for use in Tableau dashboards. An understanding of database concepts and the ability to write SQL queries to extract data are also required. A minimum of 5 years of hands-on experience is necessary across both the Tableau development and data expertise aspects of this role.

You must have proven experience as a Tableau Developer with a strong portfolio of interactive dashboards, proficiency in Tableau Prep, experience creating complex calculations and custom formulas in Tableau, a solid understanding of data concepts, and experience with Python and Knime for data manipulation. Familiarity with project management methodologies, excellent communication and collaboration skills, knowledge of PTS (Project Tracking System) or similar project management tools, experience with data visualization best practices, and Tableau certification are all essential qualifications, along with a Bachelor's degree in Computer Science, Information Technology, or a related field. This description provides a high-level overview of the work performed; other job-related duties may be assigned as required.
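
As a small illustration of the Python-based data preparation this role pairs with Tableau, a raw extract could be tidied with pandas before being pointed at a workbook; the file and column names are assumptions:

```python
import pandas as pd

# Hypothetical raw extract cleaned into a tidy file a Tableau workbook can use.
raw = pd.read_csv("project_tasks_raw.csv", parse_dates=["start", "end"])

tidy = (
    raw.dropna(subset=["task_id"])                      # drop unusable rows
       .assign(duration_days=lambda d: (d["end"] - d["start"]).dt.days)
       .query("duration_days >= 0")                     # guard bad date pairs
       .loc[:, ["task_id", "owner", "status", "duration_days"]]
)
tidy.to_csv("project_tasks_clean.csv", index=False)     # Tableau data source
```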

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have expertise in Neo4j, including its core concepts, the Cypher query language, and best practices. You will design and implement graph database solutions, creating and maintaining graph schemas, models, and architectures. The role involves migrating data from relational or other databases into Neo4j and optimizing Cypher queries for performance to ensure efficient data retrieval and manipulation. Familiarity with graph theory, graph data modeling, and other graph database technologies is essential, as is developing and optimizing Cypher queries and integrating Neo4j with BI and other systems.

It would also be good to have experience developing Spark applications using Scala or Python (PySpark) for data transformation, aggregation, and analysis. You should be able to develop and maintain Kafka-based data pipelines, create and optimize Spark applications in Scala and PySpark, and have proficiency in the Hadoop-ecosystem big data tech stack (HDFS, YARN, MapReduce, Hive, Impala). Hands-on expertise in building Neo4j graph solutions, in Spark (Scala, Python) for data processing and analysis, in Kafka for real-time data ingestion and processing, and in ETL processes and data ingestion tools would be beneficial for this role.

You will be part of the Technology department in the Applications Development job family, working full-time to apply your skills in graph databases, data processing, and big data technologies to drive impactful business solutions. If you require a reasonable accommodation due to a disability to use our search tools or apply for a career opportunity, please review Citi's Accessibility information. You can also refer to Citi's EEO Policy Statement and the Know Your Rights poster for more details.
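
A minimal sketch of running a Cypher query through the official Neo4j Python driver might look like the following; the URI, credentials, and graph model are assumptions for illustration:

```python
from neo4j import GraphDatabase

# Placeholder connection details for a local instance.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# Invented graph model: customers place orders that contain products.
CYPHER = """
MATCH (c:Customer)-[:PLACED]->(o:Order)-[:CONTAINS]->(p:Product)
WHERE o.placed_at >= date() - duration({days: 30})
RETURN p.name AS product, count(o) AS orders
ORDER BY orders DESC LIMIT 10
"""

with driver.session() as session:
    for record in session.run(CYPHER):
        print(record["product"], record["orders"])
driver.close()
```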

Posted 1 month ago

Apply

12.0 - 16.0 years

0 Lacs

Karnataka

On-site

As a Senior Principal Software Engineer at Streamline, you will play a crucial role in designing, developing, and implementing innovative software solutions tailored to the healthcare domain. Your primary focus will be leveraging a variety of technologies and frameworks to build robust, scalable solutions that align with business objectives and technical requirements.

Your responsibilities include leading the end-to-end development of healthcare software solutions, with particular emphasis on ETL processes. You will architect and implement ETL solutions using SSIS and Azure Data Factory to facilitate data movement and transformation, and create and optimize efficient SQL queries and scripts, collaborating with team members to ensure seamless database interactions. You will also analyze and enhance SQL performance, utilizing advanced techniques such as SQL Profiler and execution plans to identify bottlenecks and improve query efficiency, and develop and maintain healthcare reporting systems using SSRS to provide actionable insights.

Collaboration is a key aspect of the role: you will work closely with cross-functional teams, including data architects, business analysts, and stakeholders, to ensure accurate implementation of data flows and business logic, and you will mentor junior and mid-level engineers, supporting their technical growth and promoting best practices within the team. Staying current with the latest trends and technologies in software development and healthcare will keep the team equipped with the most effective tools and practices, and you will participate in code reviews to maintain high code quality and adherence to software engineering best practices.

To excel in this role, you should have over 12 years of software engineering experience with a strong focus on healthcare data systems. Your expertise should include enterprise-class software architecture, ETL processes, SQL Server, SQL performance tuning, and report development using SSRS. Familiarity with the Microsoft tech stack hosted on Azure and exposure to ASP.NET, C#.NET, and jQuery are advantageous. Strong analytical and problem-solving skills, along with excellent communication and collaboration abilities, are essential, and experience with cloud-based healthcare solutions, HIPAA regulations, and Scrum methodologies would be beneficial.

If you are passionate about leveraging technology to improve behavioral health and quality of life for those in need, we look forward to hearing from you. Join us at Streamline and be part of our mission to empower healthcare organizations with cutting-edge software solutions.

Posted 1 month ago

Apply

6.0 - 13.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

We are seeking a highly experienced candidate with over 13 years of experience for the role of Technical Project Manager (Data) in Trivandrum/Kochi. As a Technical Project Manager, you will own the end-to-end delivery of data platform, AI, BI, and analytics projects, ensuring alignment with business objectives and stakeholder expectations. You will develop and maintain comprehensive project plans, roadmaps, and timelines covering data ingestion, transformation, governance, AI/ML models, and analytics deliverables.

Leading cross-functional teams of data engineers, data scientists, BI analysts, architects, and business stakeholders to deliver high-quality, scalable solutions within the defined budget and timeframe is a key aspect of this role. You will define, prioritize, and manage product and project backlogs covering data pipelines, data quality, governance, AI services, and BI dashboards or reporting tools, and collaborate with business units to capture requirements and translate them into actionable user stories and acceptance criteria for data and analytics solutions.

You will oversee BI and analytics areas, including dashboard development, embedded analytics, self-service BI enablement, and ad hoc reporting capabilities, while ensuring that data quality, lineage, security, and compliance requirements are integrated throughout the project lifecycle in collaboration with governance and security teams. Coordinating UAT, performance testing, and user training will ensure successful adoption and rollout of data and analytics products.

As the primary point of contact for all project stakeholders, you will provide regular status updates, manage risks and issues, and escalate when necessary. You will facilitate agile ceremonies such as sprint planning, backlog grooming, demos, and retrospectives to foster a culture of continuous improvement, and drive post-deployment monitoring and optimization of data and BI solutions to meet evolving business needs and performance standards.
Primary skills required for this role:
- Over 13 years of experience in IT, with at least 6 years in roles such as Technical Product Manager, Technical Program Manager, or Delivery Lead
- Hands-on development experience in data engineering, including data pipelines, ETL processes, and data integration workflows
- Proven track record of managing data engineering, analytics, or AI/ML projects end to end
- Solid understanding of modern data architecture: data lakes, warehouses, pipelines, ETL/ELT, governance, and AI tooling
- Hands-on familiarity with cloud platforms (e.g., Azure, AWS, GCP) and DataOps/MLOps practices
- Strong knowledge of Agile methodologies, sprint planning, and backlog grooming
- Excellent communication and stakeholder management skills, including working with senior executives and technical leads

Secondary skills that would be beneficial:
- Background in computer science, engineering, data science, or analytics
- Experience with, or a solid understanding of, data engineering tools and services in AWS, Azure, and GCP
- Exposure to, or a solid understanding of, BI, analytics, LLMs, RAG, prompt engineering, or agent-based AI systems
- Experience leading cross-functional teams in matrixed environments
- Certifications such as PMP, CSM, SAFe, or equivalent are a plus

If you meet the above requirements and are looking for a challenging opportunity in technical project management within the data domain, we encourage you to apply before the closing date on 18-07-2025.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Senior Business Intelligence (BI) Analyst at SolarWinds, you will play a pivotal role in driving data-driven decision-making throughout the organization. Your strategic mindset and expertise in BI tools, data visualization, and advanced analytics will be crucial in transforming raw data into actionable insights that enhance business performance and operational efficiency.

Your responsibilities include developing, maintaining, and optimizing BI dashboards and reports to support business decision-making; extracting, analyzing, and interpreting complex datasets from multiple sources to identify trends and opportunities; and collaborating with cross-functional teams to define business intelligence requirements and deliver insightful solutions. Presenting key findings and recommendations to senior leadership and stakeholders will be a key aspect of your role, as will ensuring data accuracy, consistency, and governance by implementing best practices in data management. You will also conduct advanced analytics to drive strategic initiatives and mentor junior BI analysts to strengthen the overall team capability.

To excel in this role, you should hold a Bachelor's degree in Business Analytics, Computer Science, or a related field, along with at least 5 years of experience in business intelligence, data analysis, or a similar role. Proficiency in BI tools such as Tableau and Power BI, and in SQL for querying and data manipulation, is required. Experience with ETL processes, data warehousing, and database management is important, with expertise in Tableau preferred. An understanding of Google BigQuery and experience with cloud platforms like AWS and Azure are beneficial.

If you are a collaborative, accountable, and empathetic individual who thrives in a fast-paced environment and believes in the power of teamwork to drive lasting growth, SolarWinds is the place for you. Join us in our mission to accelerate business transformation with simple, powerful, and secure solutions, and grow your career while making a meaningful impact in a people-first company. All applications are treated in accordance with the SolarWinds Privacy Notice.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

As an experienced Data Engineer with 8+ years of industry experience, you will develop robust, scalable Python-based applications that meet the company's requirements. Your key responsibilities include integrating and implementing Generative AI models into business applications; designing, building, and maintaining data pipelines and data engineering solutions on Azure; and collaborating closely with cross-functional teams to define, design, and deploy innovative AI and data solutions.

You will build, test, and optimize AI pipelines, ensuring seamless integration with Azure-based data systems, and continuously research and evaluate new AI and Azure data technologies to enhance system capabilities. You will also participate actively in code reviews, troubleshooting, debugging, and documentation to maintain high standards of code quality, performance, security, and reliability.

To excel in this role, you must have advanced proficiency in Python, including frameworks such as Django, Flask, and FastAPI. Experience with Generative AI technologies such as GPT models, LangChain, and Hugging Face is beneficial. Solid expertise in Azure data engineering tools, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake Storage, is required, along with familiarity with AI/ML libraries like TensorFlow, PyTorch, or the OpenAI API, RESTful APIs, microservices architecture, and web application development. You should also have a strong understanding of databases (SQL, NoSQL), ETL processes, and containerization and orchestration technologies like Docker and Kubernetes. Strong problem-solving, analytical, and debugging skills are a must.

Preferred qualifications include a Bachelor's or Master's degree in computer science, engineering, or a related field; prior experience developing AI-enabled products or implementing AI into applications; and Azure certifications (AZ-204, DP-203, AI-102) or equivalent. Exposure to DevOps practices and CI/CD pipelines, especially Azure DevOps, is an added advantage. Beyond technical skills, strong communication, teamwork, the ability to work independently, and a passion for continuous learning and professional growth are valued. This full-time position is located in Gurgaon, Noida, Pune, Bengaluru, or Kochi.

Join us at Infogain, a human-centered digital platform and software engineering company based in Silicon Valley, where we engineer business outcomes for Fortune 500 companies and digital natives across various industries. We accelerate experience-led transformation in the delivery of digital platforms using cutting-edge technologies such as cloud, microservices, automation, IoT, and artificial intelligence. Infogain is a Microsoft Gold Partner and Azure Expert Managed Services Provider with global offices and delivery centers in multiple locations worldwide.
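
As a hedged sketch of integrating a generative model into a business application with the stack named above, a FastAPI endpoint could wrap an OpenAI chat call; the model name and endpoint design are illustrative, not a production pattern:

```python
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment

class Ask(BaseModel):
    question: str

@app.post("/ask")
def ask(body: Ask):
    # Model choice is an assumption for the sketch; pick per requirements.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": body.question}],
    )
    return {"answer": resp.choices[0].message.content}
```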

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

At Lilly, the focus is on uniting caring with discovery to enhance the lives of people worldwide. As a global healthcare leader headquartered in Indianapolis, Indiana, we are committed to developing and providing life-changing medicines, advancing disease management, and contributing to communities through philanthropy and volunteerism. Our dedicated team of 35,000 employees collaborates to put people first and make a positive impact globally.

The Enterprise Data organization at Lilly has pioneered an integrated data and analytics platform designed for the efficient processing and analysis of data sets across various environments. As part of this team, you will play a crucial role in managing, monitoring, and optimizing the flow of high-quality data to support data sharing and analytics initiatives. Your responsibilities include monitoring data pipelines to ensure smooth data flow, managing incidents to maintain data integrity, communicating effectively with stakeholders about data issues, conducting root cause analysis to enhance processes, optimizing data pipeline performance, and implementing measures to ensure data accuracy and reliability. Additionally, you will be involved in cloud cost optimization, data lifecycle management, security compliance, automation, documentation, and collaboration with various stakeholders to improve pipeline performance.

To excel in this role, you should have a Bachelor's degree in Information Technology or a related field, along with at least 5 years of work experience in information technology. Strong analytical, collaboration, and communication skills are essential, as is the ability to adopt new technologies and methodologies. Proficiency in ETL processes, SQL, AWS services, CI/CD, Apache Airflow, and ITIL practices is required; AWS certification and experience with agile frameworks are preferred.

If you are passionate about leveraging technology to drive innovation in the pharmaceutical industry and are committed to data integrity and security, we invite you to join our team in Hyderabad, India. Embrace the opportunity to contribute to Lilly's mission of making life better for people worldwide.
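
Since the role names Apache Airflow, a minimal DAG sketch of a monitored validate-then-load pipeline might look like this; the task logic and schedule are assumptions (older Airflow versions use `schedule_interval` instead of `schedule`):

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def validate():
    print("running data validation checks")  # stand-in for real checks

def load():
    print("loading validated data")          # stand-in for a warehouse load

with DAG(
    dag_id="daily_quality_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="validate", python_callable=validate) >> \
        PythonOperator(task_id="load", python_callable=load)
```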

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Supply Chain Transformation Business Analyst at Nextracker, you will play a crucial role in shaping the functional design and direction of supply chain process and system solutions that meet the needs of both internal and external stakeholders. You will drive the selection, implementation, and support of various supply chain system solutions while serving as the day-to-day lead product owner, working alongside the Supply Chain Business team to contribute to the overall success of the supply chain systems digital transformation program.

In this role, you will define the current state of the supply chain system architecture, conduct as-is process mapping to understand the existing alignment of business processes and systems, and identify process-level gaps, pain points, and limitations of the current business systems tools used by different groups within the supply chain team. You will also document to-be process and system solution requirements for end-to-end supply chain operations and create detailed use cases for process-specific to-be solutions in close collaboration with the relevant business stakeholders. Your work will include functional testing and defect triaging with the team; documenting requirements, standard operating procedures, and help manuals; and supporting UAT, cutover, and go-live activities.

Ideally, you have 8-10 years of overall experience in the design, development, and implementation of supply chain business systems, with specific expertise in logistics planning, logistics execution, supply chain visibility, and integrating transport management systems with other ecosystem applications. Your knowledge should cover business requirements documentation and development, including process flow diagrams, process/functional requirements definition methods, product and sprint backlog creation and maintenance, detailed use case/user story development, and release/go-live activities.

Beyond technical skills, you should bring a strong analytical mindset, troubleshooting and problem-solving abilities, and experience with agile project management. Excellent communication and presentation skills are essential for collaborating effectively with cross-functional teams and stakeholders, and proficiency in Microsoft Office applications is required.

Joining Nextracker means being part of a leading company in the energy transition space, offering a comprehensive portfolio of intelligent solar tracker and software solutions for solar power plants. At Nextracker, sustainability is at the core of our business, values, and operations, with a focus on People, Community, Environment, Innovation, and Integrity. Our diverse and passionate teams work together to provide smart solar and software solutions for our customers and to help mitigate climate change for future generations. If you are a creative, collaborative problem-solver with a passion for making a difference, we welcome you to join our culture-driven team at Nextracker.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Lead Consultant, Salesforce Developer - Data Cloud at Genpact, you will play a crucial role in designing, developing, and implementing solutions primarily using Data Cloud and Salesforce OMS. Your responsibilities encompass Data Cloud development, including designing and implementing data pipelines, developing data models and flows, creating data visualizations, and integrating machine learning models, as well as Agentforce development: automating tasks, improving customer service efficiency, and integrating Agentforce with various systems.

On the Data Cloud side, your key responsibilities include designing and implementing data pipelines to ingest, transform, and load data into Data Cloud; developing data models for advanced analytics; creating data visualizations and dashboards to communicate insights; and integrating machine learning and AI models into Data Cloud for enhanced analysis and prediction.

On the Agentforce side, you will design, develop, and deploy Agentforce agents to automate tasks and improve customer service efficiency. You will write complex Prompt Builder steps, implement complex orchestration flows, integrate Agentforce with various systems, train users on best practices, optimize Agentforce agents, and monitor them for errors and performance issues.

To qualify, you must hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. You should have proven experience in cloud data engineering or a similar role; strong knowledge of cloud platforms like AWS, Azure, or Google Cloud; proficiency in programming languages such as Python, Java, or Scala; experience with data modeling, ETL processes, and data warehousing; excellent problem-solving skills and attention to detail; and strong communication and collaboration skills.

If you are passionate about leveraging your expertise in Salesforce development, Data Cloud, and Agentforce to drive impactful solutions and contribute to the digital transformation of leading enterprises, this role at Genpact offers an exciting opportunity to make a difference.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

We are looking for a highly skilled Databricks Developer with expertise in Power BI to join our dynamic team. The ideal candidate has extensive experience in data engineering projects, advanced SQL, and data visualization. The role demands excellent communication, problem-solving, and analytical skills for effective collaboration with clients and cross-functional teams.

As a Databricks Developer, you are expected to have hands-on experience with Databricks and advanced SQL, and a strong understanding of data engineering projects is crucial for success. Your communication, problem-solving, and analytical skills will be exercised daily as you interact with clients and work across teams. Experience in data modeling, ETL processes, and query optimization is a valuable asset, and knowledge of Azure Synapse Analytics is beneficial.

To qualify, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field. Relevant certifications in Databricks, Power BI, or Azure are a definite plus.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be responsible for managing the end-to-end implementation tasks of the Data Warehouse project, ensuring adherence to timelines and quality standards. You will work closely with the development team to design and implement appropriate data models, ETL processes, and reporting solutions, and use project management methodologies to create project plans, define project scope, allocate resources, and manage project risks.

You will oversee testing procedures with the team to ensure data accuracy, quality, and integrity within the data warehouse, including checking test cases and overseeing testing cycles with vendor partners and the business team. Acting as a liaison between technical teams, business stakeholders, and other project members, you will ensure effective communication and collaboration, manage vendor relationships where applicable so that deliverables meet expectations and align with project goals, and maintain project documentation, prepare progress reports, and present updates to the project manager.

Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, Business Administration, or a related field.
- Proven experience (3 to 5 years) working on a project team, with knowledge of the SDLC and project implementations or similar projects.
- Excellent communication and interpersonal skills, with the ability to convey technical concepts to non-technical stakeholders.
- Certification in project management (PMP, PRINCE2, Agile certifications) is a plus.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
