
1052 ETL Processes Jobs - Page 15

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

3 - 15 Lacs

Bengaluru, Karnataka, India

On-site

Senior IT Operations Engineer (5+ years relevant experience) with broad technical experience in public cloud, virtualization, storage, databases, systems management, operations, automation, and networking concepts, able to apply these skills to provide technical support.

ESSENTIAL DUTIES AND RESPONSIBILITIES

Strategy and Architecture:
- Undertakes analytical activities and delivers analysis outputs in accordance with customer needs and agreed standards.
- Maintains in-depth knowledge of the AWS cloud platforms, provides detailed advice regarding their application, and executes specialized tasks.

Development and Implementation:
- Creates, manages, and maintains DevOps pipelines.
- Specifies user/system interfaces and translates logical designs into physical designs, taking account of the target environment, performance and security requirements, and existing systems.
- Produces detailed designs and documents all work using required standards, methods, and tools, including prototyping tools where appropriate.
- Assists in the installation and configuration of software and equipment and the systems testing of platform-specific versions of one or more software products. Documents faults, implements resolutions, and retests to agreed standards.
- Designs cloud and networking configurations, taking account of target environment, performance, security, and sustainability requirements.
- Undertakes routine installations and de-installations of hardware and/or software, ensuring targets are met within established safety and quality procedures.
- Conducts tests of hardware and/or software using supplied test procedures and diagnostic tools; corrects malfunctions, calling on experienced colleagues and external resources if required.
- Documents details of all hardware/software items installed and removed so that configuration management records can be updated.
- Develops installation procedures and standards and schedules installation work. Provides specialist guidance and advice to less experienced colleagues to ensure best use of available assets and to maintain or improve the installation service.
- Develops, enhances, reverse-engineers, and debugs new and existing Infrastructure-as-Code (IaC) capabilities using tools such as CDKs.

Delivery and Operation:
- Contributes to the availability management process and its operation and performs defined availability management tasks. Analyses service and component availability, reliability, maintainability, and serviceability, ensuring that services and components meet and continue to meet their agreed performance targets and service levels.
- Implements arrangements for disaster recovery, documents recovery procedures, and conducts testing of those procedures.
- Engages with the project team to confirm that products developed meet the service acceptance criteria and are to the required standard; feeds into change management processes.
- Applies tools, techniques, and processes to track, log, and correct information related to CIs, protecting assets and components from unauthorized change, diversion, and inappropriate use.
- Develops, documents, and implements changes based on requests for change, applying change control procedures.
- Uses the tools and techniques for specific areas of release and deployment activities; records activities, logs results, and documents technical activity undertaken.
- Reviews system software updates, identifies those that merit action, and installs and tests new versions of system software.
- Investigates and coordinates the resolution of potential and actual service problems, and prepares and maintains operational documentation for system software.
- Implements full-stack monitoring covering infrastructure, cloud platform, OS, and application telemetry.
- Explores new technologies and development patterns, and participates in pilots, POCs, and technology evaluations.

OTHER DUTIES
Performs other duties as assigned by management.

JOB QUALIFICATIONS & EXPERIENCE
Education (preferred): Bachelor's degree in Computer Science, Management Information Systems, or a related field; technical certificates in a specific technical domain, for example AWS, Azure, ITIL, or DevOps.

Experience/Skills:
- More than 5 years of IT experience.
- More than 5 years of experience building and maintaining systems in AWS.
- More than 5 years of DevOps or application operations experience (Azure DevOps).
- Recent experience designing and implementing complex, highly available, and highly scalable solutions.

Other Knowledge, Skills, Abilities, or Certifications:
- Windows and Linux administration skills.
- Apache/Tomcat/Java administration.
- Ansible playbook development skills.
- Ability to use Confluence for knowledge-base documentation.
- Grafana dashboard development and alerting.
- AWS and/or Azure public cloud certification.

Soft Skills Required: Excellent analytical and problem-solving abilities; excellent communication and presentation skills.

Seniority: Individual Contributor
TRAVEL REQUIREMENTS: Position requires no travel.
PHYSICAL DEMANDS: Normal professional office environment.
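Duties like "reviews system software updates and identifies those that merit action" lend themselves to the scripting this role calls for. A minimal Python sketch, where the package names and the dotted-version format are illustrative assumptions:

```python
def updates_meriting_action(installed, available):
    """Compare installed package versions against available releases and
    return the packages with a newer version that merits action."""
    def parse(v):
        # naive dotted-version parser, e.g. "1.2.10" -> (1, 2, 10)
        return tuple(int(p) for p in v.split("."))
    return sorted(
        pkg for pkg, ver in available.items()
        if pkg in installed and parse(ver) > parse(installed[pkg])
    )

installed = {"openssl": "3.0.1", "nginx": "1.24.0", "tomcat": "9.0.80"}
available = {"openssl": "3.0.2", "nginx": "1.24.0", "tomcat": "9.0.85"}
pending = updates_meriting_action(installed, available)
print(pending)  # openssl and tomcat have newer releases
```

In practice the inputs would come from the package manager rather than hard-coded dicts; the comparison logic is the repetitive part worth automating.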

Posted 3 weeks ago

Apply

4.0 - 6.0 years

3 - 15 Lacs

Bengaluru, Karnataka, India

On-site

We are seeking an experienced Tableau Developer to join our dynamic team in India. The ideal candidate will have a strong background in data visualization and analytics, with the ability to transform complex data into meaningful insights. You will be responsible for creating engaging dashboards and reports that drive business decisions.

Responsibilities
- Design, develop, and maintain interactive dashboards and reports using Tableau.
- Work with stakeholders to gather and understand reporting requirements.
- Optimize Tableau dashboards for performance and usability.
- Collaborate with data engineers to ensure data quality and accuracy.
- Conduct data analysis and present findings to support business decisions.

Skills and Qualifications
- 4-6 years of experience in Tableau development.
- Strong understanding of data visualization principles and best practices.
- Proficiency in SQL for data extraction and manipulation.
- Experience with data warehousing concepts and ETL processes.
- Knowledge of R or Python for advanced analytics is a plus.
- Ability to work with cross-functional teams and communicate effectively.
- Familiarity with Agile methodologies and project management tools.
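The "SQL for data extraction" plus "ETL processes" combination this listing asks for can be sketched end to end in a few lines. A minimal, hedged example using Python's built-in sqlite3; the table and column names are illustrative assumptions, not part of any real schema:

```python
import sqlite3

def run_etl(conn):
    """Extract raw sales rows with SQL, transform (aggregate by region),
    and load a summary table for the dashboard layer to consume."""
    cur = conn.cursor()
    # Extract: pull raw rows via SQL
    rows = cur.execute("SELECT region, amount FROM sales").fetchall()
    # Transform: aggregate revenue per region
    totals = {}
    for region, amount in rows:
        totals[region] = totals.get(region, 0) + amount
    # Load: write the summary back to the database
    cur.execute("CREATE TABLE IF NOT EXISTS sales_summary (region TEXT, revenue REAL)")
    cur.executemany("INSERT INTO sales_summary VALUES (?, ?)", totals.items())
    conn.commit()
    return totals

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("North", 100.0), ("South", 50.0), ("North", 25.0)])
totals = run_etl(conn)
print(totals)
```

A real pipeline would read from a warehouse and feed a Tableau extract, but the extract/transform/load shape is the same.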

Posted 3 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

BIA Tableau Migration Specialist
Work Location: Temp WFH/Remote
Shift: 2:30 PM to 11:30 PM / 3:30 PM to 12:30 AM

Job Description

Key Responsibilities:
- Lead the migration from Tableau on-premises to Tableau Cloud, ensuring minimal disruption to business operations.
- Advise on best practices for migration, including data security, performance optimization, and scalability.
- Advise on organizing Tableau assets in line with business architecture and requirements, both current and future.
- Help design user-account onboarding, security implementation, and automation.
- Help implement Tableau asset design governance to ensure optimized assets.
- Perform hands-on system configuration and conversion tasks.
- Manage the project lifecycle, including planning, execution, monitoring, and reporting.
- Collaborate with stakeholders to understand requirements, provide solutions, and ensure alignment with business goals.
- Develop and maintain comprehensive documentation for the migration process, including technical specifications and user guides.
- Troubleshoot and resolve issues related to the migration process, providing timely support and solutions.
- Provide training and support to end users post-migration, ensuring they are proficient with Tableau Cloud.
- Provide advice on strategy and tactics for adoption and risk management of AI functionality (e.g., Pulse and Tableau Agent) in the Tableau Cloud Plus platform.
- Stay updated with the latest Tableau features and industry trends to continuously improve migration strategies.

Required Skills:
- Minimum of 8 years of experience in Tableau projects, including on-premise and cloud environments.
- Proven experience in managing and advising on at least one similar Tableau Cloud migration.
- Strong knowledge of Tableau on-premise and Tableau Cloud, including architecture and functionalities.
- Excellent project management skills, including the ability to manage timelines, budgets, and resources effectively.
- Proficiency in system configuration, conversion, and ETL processes.
- Strong analytical and problem-solving skills, with the ability to troubleshoot complex issues.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively with diverse teams.
- Ability to work independently and as part of a team, demonstrating initiative and leadership.

Preferred Qualifications:
- Certification(s) in Tableau.
- Experience with other data visualization tools and platforms.
- Familiarity with cloud platforms and services, such as Azure, Snowflake, and Databricks.
- Knowledge of data governance and compliance standards.
- Experience in Agile project management methodologies.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.
Working Time: India 2nd shift, till 11:30 PM IST / 2:00 AM EST or 12:30 AM IST / 3:00 AM EST
Work Location: Remote (Hyderabad / Chennai / Pune)

Posted 3 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Title: Data Analyst (JavaScript Coding, Google Apps Script, Excel)
Location: Ghatkopar, Mumbai (Work from Office)
Salary: ₹50,000 per month
Experience: 2-5 years
Employment Type: Full-Time

Role Overview
We are looking for a skilled Data Analyst with expertise in JavaScript coding, Google Apps Script, and Advanced Excel to design, automate, and visualize data for actionable insights. The role involves turning complex datasets into easy-to-understand dashboards, automating repetitive data workflows, and creating interactive visualizations that guide decision-making.

Key Responsibilities
- Data Visualization & Coding: Develop dynamic, interactive charts and visual dashboards using JavaScript and popular libraries (e.g., Chart.js, D3.js).
- Automation: Use Google Apps Script to automate data extraction, transformation, and loading (ETL) processes for faster analysis.
- Dashboard Development: Build and maintain interactive Excel dashboards with advanced formulas, Pivot Tables, Macros, and Power Query.
- Perform data cleaning, validation, and error-checking to ensure accuracy.
- Translate data findings into clear insights for management and operational teams.
- Collaborate with cross-functional teams to identify reporting requirements and deliver timely outputs.

Required Skills & Experience
- Proficiency in JavaScript for data visualization and interactivity.
- Strong command of Google Apps Script for automating reporting processes.
- Advanced Excel skills, including interactive dashboards, Macros, and Power Query.
- Experience working with large datasets and ensuring high data integrity.
- Analytical thinking and problem-solving skills.
- Strong attention to detail and ability to meet deadlines.

What We Offer
- Competitive salary and growth opportunities.
- Challenging projects combining analytics, automation, and visualization.
- A collaborative environment where your ideas influence real business outcomes.

Skills: automation, Excel dashboards, data management, dashboards, MIS reporting, reporting, Google Apps Script, data visualization, stakeholder engagement, dashboard builder, SQL, advanced Excel, communication skills, data handling, advanced formulas, dashboard automation, data accuracy, data analysis, Microsoft Excel, team management, dashboard creation, JavaScript, coding experience

Posted 3 weeks ago

Apply

6.0 - 8.0 years

3 - 15 Lacs

Bengaluru, Karnataka, India

On-site

6 to 8 years of relevant experience.
- Work on Ansible automation of OS and DB provisioning.
- Support SUSE Linux, Debian Linux, Oracle Linux, and Red Hat Linux operating systems.
- Design, develop, troubleshoot, and debug Linux servers for environment stability.
- Determine hardware compatibility and/or influence hardware designs.
- Define and implement the automation strategy.
- Develop scripts to automate repetitive tasks to reduce manual intervention and save effort.
- Proactively monitor the health of applications and infrastructure through various tools.
- Track and review the service improvement plan (automation, process improvements).
- Plan critical events (upgrades, new enhancements, rollouts).
- Ensure the reliability and stability of the Linux and PCA environment.
- Constantly re-evaluate how we implement cloud to stay relevant and safe.
- Mitigate any vulnerabilities reported on the Linux environment, taking action according to the criticality of each reported vulnerability to keep the environment secure.
- Extensive experience in the implementation, administration, and configuration of Red Hat Linux, Oracle Linux, Debian Linux, and SUSE Linux environments.
- Technically competent in handling severity calls and tickets as well as customer requests, with vendor-handling exposure.
- Ansible scripting, shell scripting, Azure platform knowledge.
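The duty of mitigating vulnerabilities "according to the criticality of the vulnerabilities reported" is essentially a prioritization problem. A minimal Python sketch; the SLA day counts and CVE identifiers are illustrative assumptions:

```python
# Illustrative remediation SLAs in days, keyed by reported criticality.
SLA_DAYS = {"critical": 7, "high": 30, "medium": 90, "low": 180}

def triage(vulns):
    """Order reported vulnerabilities by remediation SLA, most urgent first.
    Unknown severities fall to the back of the queue."""
    return sorted(vulns, key=lambda v: SLA_DAYS.get(v["severity"], 365))

vulns = [
    {"cve": "CVE-2024-0001", "severity": "low"},
    {"cve": "CVE-2024-0002", "severity": "critical"},
    {"cve": "CVE-2024-0003", "severity": "high"},
]
queue = triage(vulns)
print([v["cve"] for v in queue])
```

The same ordering could feed an Ansible playbook run that patches the most urgent hosts first.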

Posted 3 weeks ago

Apply

6.0 - 8.0 years

3 - 15 Lacs

Chennai, Tamil Nadu, India

On-site

6 to 8 years of relevant experience.
- Work on Ansible automation of OS and DB provisioning.
- Support SUSE Linux, Debian Linux, Oracle Linux, and Red Hat Linux operating systems.
- Design, develop, troubleshoot, and debug Linux servers for environment stability.
- Determine hardware compatibility and/or influence hardware designs.
- Define and implement the automation strategy.
- Develop scripts to automate repetitive tasks to reduce manual intervention and save effort.
- Proactively monitor the health of applications and infrastructure through various tools.
- Track and review the service improvement plan (automation, process improvements).
- Plan critical events (upgrades, new enhancements, rollouts).
- Ensure the reliability and stability of the Linux and PCA environment.
- Constantly re-evaluate how we implement cloud to stay relevant and safe.
- Mitigate any vulnerabilities reported on the Linux environment, taking action according to the criticality of each reported vulnerability to keep the environment secure.
- Extensive experience in the implementation, administration, and configuration of Red Hat Linux, Oracle Linux, Debian Linux, and SUSE Linux environments.
- Technically competent in handling severity calls and tickets as well as customer requests, with vendor-handling exposure.
- Ansible scripting, shell scripting, Azure platform knowledge.

Posted 3 weeks ago

Apply

1.0 - 5.0 years

2 - 6 Lacs

Hyderabad, Telangana, India

On-site

We are seeking a Data Specialist to join our team in India. The ideal candidate will have 1-5 years of experience in data analysis and management, with a strong ability to work with large datasets and derive actionable insights.

Responsibilities
- Collect, analyze, and interpret large datasets to support business decision-making.
- Develop and maintain data systems and databases; this includes fixing coding errors and other data-related problems.
- Create data visualizations and reports to present findings to stakeholders.
- Ensure data quality and integrity by conducting regular audits and assessments.
- Collaborate with cross-functional teams to understand data requirements and deliver actionable insights.

Skills and Qualifications
- Proficiency in SQL for data querying and manipulation.
- Experience with data visualization tools such as Tableau, Power BI, or similar.
- Strong analytical and problem-solving skills with attention to detail.
- Knowledge of statistical analysis and data modeling techniques.
- Familiarity with programming languages such as Python or R for data analysis.
- Excellent communication skills to present data findings clearly and concisely.
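The "regular audits and assessments" responsibility above can be sketched as a simple data-quality check. The record structure and field names below are illustrative assumptions:

```python
def audit_records(records, required_fields):
    """Summarize data-quality issues: records with missing/blank required
    fields, and records reusing an already-seen id."""
    issues = {"missing": 0, "duplicates": 0}
    seen_ids = set()
    for rec in records:
        if any(rec.get(f) in (None, "") for f in required_fields):
            issues["missing"] += 1
        if rec.get("id") in seen_ids:
            issues["duplicates"] += 1
        seen_ids.add(rec.get("id"))
    return issues

records = [
    {"id": 1, "name": "Asha", "city": "Hyderabad"},
    {"id": 2, "name": "", "city": "Pune"},           # blank required field
    {"id": 1, "name": "Asha", "city": "Hyderabad"},  # duplicate id
]
summary = audit_records(records, required_fields=["name", "city"])
print(summary)
```

A production audit would add type and range checks, but counting missing values and duplicate keys is usually the first pass.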

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

The Senior Data Analytics Consultant role requires a unique blend of analytical expertise, technological proficiency, and collaborative skills. Your primary responsibility will be to develop and implement data strategies that enhance business outcomes through innovative analytics and predictive modeling. You must be well-versed in SQL, Power BI, and Alteryx, with a solid understanding of data modeling, cloud data solutions, and data quality management practices. Leadership and stakeholder management skills are crucial for translating data insights into actionable business strategies effectively. With at least 3 years of experience in data analytics, including 2 years of hands-on experience with SQL, Alteryx, and Power BI, you will lead data analytics initiatives by leveraging advanced visualization and statistical techniques. Your expertise in Power BI, particularly in Power Query, Data Modeling, and Visualization, will be essential. Additionally, you should have a strong grasp of DAX for creating complex calculations and measures, as well as experience with Row Level Security (RLS) in Power BI. Your responsibilities will include spearheading data analytics projects, designing and maintaining data workflows using Alteryx, and collaborating with cross-functional teams to understand processes, challenges, and customer needs thoroughly. You will also play a key role in defining and visualizing key performance indicators (KPIs) to measure project success and stakeholder engagement. A Bachelor's degree in Computer Science, Engineering, Data Science, or related fields is required, with a preference for a Master's degree in Data Analytics, Data Science, Statistics, or relevant certifications such as CAP, DASCA, or Microsoft Certified: Data Analyst Associate. Programming skills and knowledge of cloud technologies are advantageous for this role. 
If you have a track record of strategic impact in data analytics, strong problem-solving abilities, and the ability to mentor junior analysts, you are the ideal candidate for this position.
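KPI definitions like the ones this role calls for are often prototyped as plain measures before being implemented as DAX in Power BI. A sketch with illustrative order fields (the KPI choices are assumptions, not the client's actual metrics):

```python
def kpis(orders):
    """Compute two example KPIs over an order list:
    total revenue and average order value (AOV)."""
    total = sum(o["amount"] for o in orders)
    aov = total / len(orders) if orders else 0.0
    return {"total_revenue": total, "avg_order_value": round(aov, 2)}

orders = [{"amount": 120.0}, {"amount": 80.0}, {"amount": 100.0}]
print(kpis(orders))  # {'total_revenue': 300.0, 'avg_order_value': 100.0}
```

In Power BI the same two measures would be `SUM` and `AVERAGEX` style DAX expressions; prototyping them in code first makes the definitions easy to agree on with stakeholders.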

Posted 4 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Business Management Analyst at our Mumbai office, you will be responsible for designing and delivering critical senior management dashboards and analytics using tools such as Excel and SQL. Your primary task will be to create management packs that facilitate timely decision-making for various business units and establish a strong foundation for analytics. Collaboration with senior business managers, data engineers, and stakeholders from different teams will be essential to understand requirements and transform them into visually appealing dashboards and reports. Your role will involve analyzing business data and generating valuable insights for strategic ad hoc exercises. Under our flexible scheme, you can enjoy a host of benefits, including a best-in-class leave policy, gender-neutral parental leaves, childcare assistance reimbursement, sponsorship for industry certifications, and education, an Employee Assistance Program, comprehensive insurance coverage, and health screening for individuals above 35 years. Your key responsibilities will include gathering requirements from business users and managers, conducting ad hoc data analysis to produce reports and visualizations for strategic decision-making, sourcing information from multiple sources to build a robust data pipeline model, performing audit checks for data integrity, identifying process improvement opportunities based on data insights, and providing project status updates and recommendations. To excel in this role, you should have a Bachelor's degree in computer science, IT, Business Administration, or a related field, along with a minimum of 5 years of experience in visual reporting development. 
Proficiency in Microsoft Office, especially advanced Excel skills, a solid understanding of data visualization best practices, experience with data analysis, modeling, and ETL processes, strong SQL skills, analytical and problem-solving abilities, attention to detail, excellent communication skills, and the capability to manage multiple tasks and deadlines are essential. We provide training, coaching, and a culture of continuous learning to support your career development. At Deutsche Bank, we strive to create an inclusive work environment where everyone is empowered to excel together. Visit our company website for more information: [Deutsche Bank Company Website](https://www.db.com/company/company.htm).

Posted 4 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You have a minimum of 5 years of experience in visualization and data modelling using Tableau and Power BI, along with strong SQL skills. Your responsibilities will include integrating fully into a project team, collaborating with project stakeholders to gather BI requirements, and delivering high-quality Business Objects universes and reporting. You will review existing silo databases, build integrations, offer a holistic view of data, improve data visibility and quality, and enhance reporting database structures as needed. Additionally, you will provide guidance on database structure improvements, understand how integrated data sets can be utilized in Business Intelligence, and administer Business Objects processes and servers. It is essential to maintain collaborative working relationships with internal and external project stakeholders, effectively manage your workload to meet project timescales, and identify areas for improvement in collaboration with the Product Delivery Manager. You must have expertise in SQL for querying, function creation, and view/table creation, as well as proficiency in Tableau, Power BI, T-SQL, dimensional modelling, and the SAP Business Objects product suite for reporting and universe design. Knowledge of ETL processes, data warehousing strategies, and other BI products, including cloud solutions, is required. Business and data analysis skills, along with proficiency in Tableau for visualization, are essential for this role. If you are passionate about Business Intelligence, data modelling, and visualization, and possess the necessary technical skills and experience, we encourage you to apply by sending your resume to careers@savantyssolutions.com.

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

MongoDB's mission is to empower innovators to create, transform, and disrupt industries by unleashing the power of software and data. We enable organizations of all sizes to easily build, scale, and run modern applications by helping them modernize legacy workloads, embrace innovation, and unleash AI. Our industry-leading developer data platform, MongoDB Atlas, is the only globally distributed, multi-cloud database and is available in more than 115 regions across AWS, Google Cloud, and Microsoft Azure. Atlas allows customers to build and run applications anywhere: on premises or across cloud providers. With offices worldwide and over 175,000 new developers signing up to use MongoDB every month, it's no wonder that leading organizations, like Samsung and Toyota, trust MongoDB to build next-generation, AI-powered applications. As a Senior Analytics Engineer at MongoDB, you will play a critical role in leveraging data to drive informed decision-making and simplify end-user engagement across our most critical data sets. You will be responsible for designing, developing, and maintaining robust analytics solutions, ensuring data integrity, and enabling data-driven insights across all of MongoDB. This role requires an analytical thinker with strong technical expertise to contribute to the growth and success of the entire business. This role can be based out of Gurugram.
Responsibilities
- Design, implement, and maintain highly performant data post-processing pipelines.
- Create shared data assets that will act as the company's source of truth for critical business metrics.
- Partner with analytics stakeholders to curate analysis-ready datasets and augment the generation of actionable insights.
- Partner with data engineering to expose governed datasets to the rest of the organization.
- Make impactful contributions to our analytics infrastructure, systems, and tools.
- Create and manage documentation, and conduct knowledge-sharing sessions to proliferate tribal knowledge and best practices.
- Maintain consistent planning and tracking of work in JIRA tickets.

Skills & Attributes
- Bachelor's degree (or equivalent) in mathematics, computer science, information technology, engineering, or a related discipline.
- 3-5 years of relevant experience.
- Strong proficiency in SQL and experience working with relational databases.
- Solid understanding of data modeling and ETL processes.
- Proficiency in Python for automation, data manipulation, and analysis.
- Experience managing ETL and data pipeline orchestration with dbt and Airflow.
- Comfortable with command-line functions.
- Familiarity with Hive, Trino (Presto), SparkSQL, and Google BigQuery.
- Experience with cloud data storage such as AWS S3 and GCS.
- Experience managing codebases with git.
- Consistently employs CI/CD best practices.
- Experience translating project requirements into a set of technical sub-tasks that build towards a final deliverable.
- Experience combining data from disparate data sources to identify previously unknown insights.
- Previous project work requiring expertise in business metrics and datasets.
- Strong communication skills to document technical processes clearly and lead knowledge-sharing efforts across teams.
- The ability to collaborate effectively cross-functionally to drive actionable and measurable results.
- Committed to continuous improvement, with a passion for building processes and tools to make everyone more efficient.
- A passion for AI as an enhancing tool to improve workflows, increase productivity, and generate smarter outcomes.
- A desire to constantly learn and improve.

At MongoDB, we're committed to developing a supportive and enriching culture for everyone to drive personal growth and business impact. From employee affinity groups to fertility assistance and a generous parental leave policy, we value our employees' wellbeing and want to support them along every step of their professional and personal journeys. MongoDB is committed to providing any necessary accommodations for individuals with disabilities within our application and interview process. To request an accommodation due to a disability, please inform your recruiter. MongoDB is an equal opportunities employer.
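A data post-processing pipeline of the kind this role describes can be sketched as composable steps. The field names are illustrative assumptions; in practice such steps would typically live in dbt models orchestrated by Airflow rather than inline Python:

```python
def dedupe(rows):
    """Drop exact duplicate rows while preserving order."""
    seen, out = set(), []
    for r in rows:
        key = tuple(sorted(r.items()))
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out

def add_margin(rows):
    """Derive a margin metric from revenue and cost on each row."""
    return [{**r, "margin": r["revenue"] - r["cost"]} for r in rows]

def run_pipeline(rows, steps):
    """Apply post-processing steps in order, each consuming the last one's output."""
    for step in steps:
        rows = step(rows)
    return rows

raw = [
    {"sku": "A", "revenue": 100, "cost": 60},
    {"sku": "A", "revenue": 100, "cost": 60},  # duplicate load
    {"sku": "B", "revenue": 80, "cost": 50},
]
curated = run_pipeline(raw, [dedupe, add_margin])
print(curated)
```

Keeping each step small and order-independent of the others is what makes such a pipeline easy to test and to re-run when an upstream load is repeated.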

Posted 4 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Platform Engineer - Tech Lead at Deutsche Bank in Pune, India, you will be part of DB Technology's global team of tech specialists. Your role involves leading a group of engineers working on cutting-edge technologies in Hadoop, Big Data, GCP, Terraform, BigQuery, Dataproc, and data management to develop robust data pipelines, ensure data quality, and implement efficient data management solutions. Your leadership will drive innovation, maintain high standards in data infrastructure, and mentor team members to support data-driven initiatives. You will collaborate with data engineers, analysts, cross-functional teams, and stakeholders to ensure the data platform meets the organization's needs. Your responsibilities include working on a hybrid data platform to unlock new insights and drive business growth. You will contribute to all stages of software delivery, from initial analysis to production support, within a cross-functional agile delivery team.

Key Responsibilities:
- Lead a cross-functional team in designing, developing, and implementing on-prem and cloud-based data solutions.
- Provide technical guidance and mentorship to foster continuous learning and improvement.
- Collaborate with product management and stakeholders to define technical requirements and establish delivery priorities.
- Architect and implement scalable, efficient, and reliable data management solutions for complex data workflows and analytics.
- Evaluate tools, technologies, and best practices to enhance the data platform.
- Drive adoption of microservices, containerization, and serverless architectures.
- Establish and enforce best practices in coding, testing, and deployment.
- Oversee code reviews and provide feedback to promote code quality and team growth.

Skills and Experience:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of software engineering experience with a focus on Big Data and GCP technologies.
- Strong leadership skills with experience in mentorship and team growth.
- Expertise in designing and implementing data pipelines, ETL processes, and real-time data processing.
- Hands-on experience with Hadoop ecosystem tools and Google Cloud Platform services.
- Understanding of data quality management and best practices.
- Familiarity with containerization and orchestration tools.
- Strong problem-solving and communication skills.

Deutsche Bank offers a culture of continuous learning, training, and development to support your career progression. You will receive coaching and support from experts in your team and benefit from a range of flexible benefits tailored to your needs. Join us in creating innovative solutions and driving business growth at Deutsche Bank.
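The real-time data processing expertise asked for here often reduces to windowed aggregation over an event stream. A minimal fixed-window sketch; treating timestamps as epoch seconds and a 60-second window are illustrative assumptions:

```python
from collections import defaultdict

def window_counts(timestamps, window_s=60):
    """Bucket event timestamps (epoch seconds) into fixed, non-overlapping
    windows and count the events in each window."""
    counts = defaultdict(int)
    for ts in timestamps:
        counts[ts - ts % window_s] += 1  # window start = ts rounded down
    return dict(counts)

events = [0, 10, 59, 60, 125]
print(window_counts(events))  # {0: 3, 60: 1, 120: 1}
```

Streaming engines (e.g., Dataflow or Spark Structured Streaming) add watermarking and late-data handling on top, but the window-keying idea is the same.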

Posted 4 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

You are an experienced and highly skilled Senior AWS Data Engineer with over 8 years of experience, ready to join our dynamic team. Your deep understanding of data engineering principles, extensive experience with AWS services, and proven track record of designing and implementing scalable data solutions make you the ideal candidate for this role. Your key responsibilities will include designing and implementing robust, scalable, and efficient data pipelines and architectures on AWS. You will develop data models and schemas to support business intelligence and analytics requirements, utilizing AWS services such as S3, Redshift, EMR, Glue, Lambda, and Kinesis to build and optimize data solutions. It will be your responsibility to implement data security and compliance measures using AWS IAM, KMS, and other security services, as well as design and develop ETL processes to ingest, transform, and load data from various sources into data warehouses and lakes. Ensuring data quality and integrity through validation, cleansing, and transformation processes, optimizing data storage and retrieval performance through indexing, partitioning, and other techniques, and monitoring and troubleshooting data pipelines for high availability and reliability will also be part of your role. Collaboration with cross-functional teams, providing technical leadership and mentorship to junior data engineers, identifying opportunities to automate and streamline data processes, and participating in on-call rotations for critical systems and services are also expected from you. Your required qualifications, capabilities, and skills include experience in software development and data engineering, with hands-on experience in Python and PySpark.
You should have proven experience with cloud platforms such as AWS, Azure, or Google Cloud, a good understanding of data modeling, data architecture, ETL processes, and data warehousing concepts, and experience with cloud-native ETL platforms like Snowflake and/or Databricks. Proven experience with big data technologies and services like AWS EMR, Redshift, Lambda, and S3, efficient Cloud DevOps practices and CI/CD tools like Jenkins/GitLab for data engineering platforms, good knowledge of SQL and NoSQL databases including performance tuning and optimization, and experience with declarative infra provisioning tools like Terraform, Ansible, or CloudFormation will be valuable assets. Strong analytical skills to troubleshoot issues and optimize data processes, and the ability to work both independently and collaboratively, are also necessary for this role. Preferred qualifications, capabilities, and skills that would be beneficial for this role include knowledge of the machine learning model lifecycle, language models, and cloud-native MLOps pipelines and frameworks, as well as familiarity with data visualization tools and data integration patterns.
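The storage-partitioning technique this role mentions is commonly realized in S3-based lakes as Hive-style key prefixes, which engines like Athena, Glue, and Spark can prune when filtering by date. A small sketch; the bucket name and record layout are illustrative assumptions:

```python
from datetime import date

def partition_key(record, base="s3://bucket/events"):
    """Build a Hive-style partition prefix (year=/month=/day=) from a
    record's event date, so date-filtered queries scan only matching prefixes."""
    d = record["event_date"]
    return f"{base}/year={d.year}/month={d.month:02d}/day={d.day:02d}"

rec = {"event_date": date(2024, 3, 7), "payload": "..."}
print(partition_key(rec))  # s3://bucket/events/year=2024/month=03/day=07
```

Writers (Glue jobs, Spark `partitionBy`) usually emit this layout automatically; computing the prefix by hand is mainly useful for custom ingest code and for validating what a query engine will prune.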

Posted 4 weeks ago

Apply

1.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a Senior Associate Full Stack Software Engineer at Amgen in Hyderabad, India, you will be an integral part of a team focused on designing, developing, and maintaining software applications and solutions to meet business needs. Your role will involve collaborating with product managers, designers, data engineers, and other team members to create high-quality, scalable software solutions and to ensure the availability and performance of critical systems.

One of the key initiatives you will be involved in is a regulatory submission content automation project aimed at modernizing and digitizing the regulatory submission process. This project will showcase innovative technologies such as Generative AI, Structured Content Management, and integrated data to automate the creation and management of regulatory content, positioning Amgen as a leader in regulatory innovation.

Your responsibilities will include rapidly prototyping concepts into working code, contributing to both front-end and back-end development using cloud technology, and developing innovative solutions using generative AI technologies. You will be expected to ensure code quality and consistency; create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations; and identify and resolve technical challenges effectively. Additionally, you will play a crucial role in analyzing functional and technical requirements, translating them into software architecture and design specifications, and implementing testing strategies to ensure software quality. Collaboration with multi-functional teams, customization of modules to meet specific business requirements, integration with other systems and platforms, and ongoing support and maintenance for applications will also be part of your responsibilities.

To excel in this role, you are required to have a Master's degree with 1 to 3 years of experience, a Bachelor's degree with 3 to 5 years of experience, or a Diploma with 7 to 9 years of experience in Computer Science, IT, or related fields. Proficiency in Python/PySpark development, FastAPI, PostgreSQL, Databricks, DevOps tools, CI/CD, and data ingestion, along with knowledge of HTML, CSS, JavaScript, React, Angular, data engineering concepts, ETL processes, cloud computing principles, software development methodologies, and version control systems like Git, is essential.

Preferred qualifications include familiarity with cloud platforms (e.g., AWS, GCP, Azure), containerization technologies (e.g., Docker, Kubernetes), monitoring and logging tools (e.g., Prometheus, Grafana, Splunk), and data processing tools like Hadoop and Spark. Strong problem-solving, analytical, communication, and interpersonal skills, as well as experience with API integration, serverless and microservices architectures, and SQL/NoSQL databases, are highly beneficial.

At Amgen, you can expect a competitive base salary and comprehensive Total Rewards Plans aligned with industry standards. The company is dedicated to fostering an inclusive environment where diverse, ethical, committed, and accomplished individuals respect each other and uphold Amgen values to advance science and serve patients. Amgen ensures that individuals with disabilities are provided reasonable accommodations throughout the job application process and during employment. Join the Amgen team to make a lasting impact on the lives of patients while advancing your career in a collaborative and innovative culture.

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

You will be responsible for collaborating with clients and stakeholders to gather and understand business requirements, then translating those needs into technical specifications for MicroStrategy BI solutions. You will design and develop MicroStrategy reports, dashboards, and interactive visualizations, utilizing MicroStrategy features to create efficient and user-friendly BI solutions.

As part of your role, you will define and implement data models that support reporting and analytics requirements while ensuring data accuracy, integrity, and optimal performance within MicroStrategy. You will also optimize MicroStrategy reports and queries for improved performance, identifying and implementing best practices to enhance overall system efficiency.

Client collaboration is a key aspect of this position: you will work closely with clients to demonstrate MicroStrategy capabilities and gather feedback, and you will provide training and support to end-users to ensure the effective use of MicroStrategy solutions. Integrating MicroStrategy with various data sources and third-party applications as needed, and collaborating with IT teams to ensure seamless data flow between systems, is crucial. Furthermore, you will design and implement security models within the MicroStrategy environment, defining user roles, access controls, and data security measures. Creating and maintaining documentation for MicroStrategy solutions, configurations, and best practices is essential to ensure knowledge transfer for future reference. It is also imperative to stay updated on the latest MicroStrategy features and to evaluate and recommend new technologies that enhance BI capabilities.

As for qualifications, a Bachelor's degree in Computer Science, Information Technology, or a related field is required, along with proven experience as a MicroStrategy Consultant with expertise in MicroStrategy architecture and development. A strong understanding of BI concepts, data modeling, and data warehousing, together with proficiency in SQL to write complex queries for data analysis, is essential, as are excellent problem-solving and analytical skills and strong communication and interpersonal skills for client interactions.

Preferred skills include MicroStrategy certification, experience with other BI tools such as Tableau, Power BI, or QlikView, knowledge of data visualization best practices, and familiarity with ETL processes and tools. Additionally, holding one of the following certifications is considered a plus: MicroStrategy Certified Master Analyst (MCMA), MicroStrategy Certified Specialist Developer (MCSD), MicroStrategy Certified Master Developer, or MicroStrategy Certified Developer (MCD).

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

You will be working as a Data Migration Engineer at Worldline, where your primary responsibility will be to handle data migration tasks efficiently. Your role will draw on your technical skills in Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Cloud SQL, Cloud Spanner, and Google Cloud Dataflow, and knowledge of GCP's Identity and Access Management (IAM) is essential for ensuring data security during migration. Experience with data migration tools like Google Cloud Transfer Service and Data Transfer Service, as well as third-party tools like Talend, Apache NiFi, and Informatica, will be beneficial, and familiarity with GCP's Database Migration Service (DMS) for migrating databases to GCP is also required.

You should have hands-on experience with both relational databases (e.g., MySQL, PostgreSQL, Oracle) and NoSQL databases (e.g., Bigtable, Firestore), along with expertise in data modeling and schema design to optimize data migration. Designing and implementing ETL (Extract, Transform, Load) pipelines using tools such as Google Cloud Dataflow, Apache Beam, or Cloud Composer will be part of your responsibilities. Proficiency in SQL for querying and transforming data is necessary, and familiarity with programming languages like Python or Java for developing migration scripts and automating data workflows is preferred. Understanding GCP networking concepts, storage solutions, data formats, and protocols is essential for effective data import/export. You will also be expected to use monitoring and logging tools such as Stackdriver to track data migration progress and troubleshoot any issues that arise. Knowledge of data governance, compliance regulations (e.g., GDPR, HIPAA), and security best practices related to data migration is crucial, and familiarity with version control systems like Git and CI/CD tools is required for managing and automating deployment processes.

In addition to technical skills, strong problem-solving abilities, communication skills, and project management experience are important for collaborating effectively with stakeholders and ensuring successful data migration outcomes. At Worldline, we value diversity, inclusion, and innovation, fostering a workplace where everyone feels empowered to contribute authentically. Extensive training, mentorship, and development programs are provided to support your growth and help you make a meaningful impact. Join us at Worldline and be part of a team that believes in promoting diversity and inclusion to drive innovation and success. Explore more about life at Worldline at Jobs.worldline.com. We are an Equal Opportunity Employer.

Request ID: 300284 | Posting Start Date: 7/31/25 | Job Area: Technology | Work Site: Hybrid | Contract Type: Permanent | Brand: Worldline | Job Location: India - Pune | #LI-NY1
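The integrity checks that a migration like the one above depends on can be sketched with stdlib sqlite3 standing in for the source and target databases. The table and column names are hypothetical, and GCP's Database Migration Service automates this at scale; the sketch only shows the verify-after-copy pattern:

```python
import sqlite3

# Source and target stand in for, e.g., an on-prem MySQL and Cloud SQL.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

source.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Asha"), (2, "Ravi"), (3, "Mei")])

# Migrate the rows; a real pipeline would batch and transform along the way.
rows = source.execute("SELECT id, name FROM customers ORDER BY id").fetchall()
target.executemany("INSERT INTO customers VALUES (?, ?)", rows)
target.commit()

def fingerprint(db):
    """Row count plus an order-stable content digest for cheap verification."""
    count, = db.execute("SELECT COUNT(*) FROM customers").fetchone()
    digest = hash(tuple(db.execute(
        "SELECT id, name FROM customers ORDER BY id").fetchall()))
    return count, digest

# Verify: counts and content fingerprints must match before cutover.
assert fingerprint(source) == fingerprint(target)
```

Comparing both a row count and a content fingerprint catches silent truncation as well as corrupted or re-ordered values, which a count alone would miss.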

Posted 4 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

hyderabad, telangana

On-site

We are looking for a skilled and experienced Adobe Experience Platform (AEP) Developer to join our team. The ideal candidate has prior experience with Adobe Experience Platform and other Adobe Marketing Cloud solutions such as Adobe Target, Adobe Analytics, and Adobe Campaign, along with expertise in digital targeting and marketing automation technologies.

As an AEP Developer, your responsibilities will include analyzing business requirements and translating them into comprehensive AEP solutions. You will develop detailed business use cases demonstrating how AEP functionality can address specific business needs and enhance customer experiences, and you will design, develop, and implement end-to-end use cases that align with business objectives. Your role will also involve designing and implementing efficient ETL processes and integrations with other systems to ensure a continuous flow of data for marketing activities; experience with Adobe Journey Optimizer for accurate tracking, targeting, and messaging is essential. Additionally, you will monitor the performance of AEP solutions, identify and resolve issues, troubleshoot technical challenges, and optimize workflows for enhanced efficiency, staying updated on the latest advancements in Adobe Experience Platform and related technologies.

To qualify for this role, you should have a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, and at least 4 years of experience developing and implementing marketing automation solutions, with a focus on Adobe Marketing Cloud and Adobe Experience Platform services. Technical developer certifications across relevant products and expertise in core web technologies are required. You should also have experience in AEP data collection, ingestion, ETL workflows, data preparation development, data governance, data activation, performance optimization, data quality assurance, and AJO journey creation and measurement.

Join our dynamic team and contribute to enhancing our marketing operations by leveraging the power of Adobe Experience Platform to deliver personalized experiences to our customers. In addition, we offer a gender-neutral policy, 18 paid holidays throughout the year, generous parental leave, flexible work arrangements, and employee assistance programs to support your wellness and well-being.

Publicis Sapient is a digital transformation partner that helps organizations transition to a digitally-enabled state. Our team combines expertise in technology, data sciences, consulting, and customer experience to accelerate our clients' businesses through innovative solutions. If you are passionate about digital transformation and customer-centric solutions, we invite you to join our team and be part of our mission to help people thrive in the pursuit of next.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

haryana

On-site

As a Data Replication Specialist, you will be responsible for designing, developing, and implementing data replication solutions using Qlik Replicate. Your key duties will include configuring and maintaining Qlik Replicate environments, managing end-to-end data replication processes, and working with a variety of data sources and target systems such as Oracle, SQL Server, MySQL, SAP, data warehouses, and cloud platforms. You will also implement Change Data Capture (CDC) mechanisms to capture and replicate real-time data changes, monitor and troubleshoot replication processes to ensure data integrity, and optimize replication performance through techniques like performance tuning and indexing. Collaboration with data architects, database administrators, and ETL developers to integrate data replication solutions with overall data management strategies is essential.

To succeed in this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field and have a minimum of 4 years of hands-on experience with Qlik Replicate. A strong understanding of database concepts, data warehousing principles, and ETL processes is required, along with proficiency in SQL and experience with database management systems like Oracle and SQL Server. Knowledge of Change Data Capture (CDC) techniques and familiarity with data integration tools and technologies will be beneficial. Additionally, you will develop and maintain technical documentation covering Qlik Replicate configurations, processes, and best practices; provide support and training to other team members; and stay up to date with the latest Qlik Replicate features and industry trends.
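Qlik Replicate captures changes from database transaction logs; purely to illustrate the change-capture idea behind CDC, here is a timestamp-watermark polling sketch in stdlib Python. The table, columns, and timestamps are hypothetical, and log-based CDC avoids the update-visibility limits of this polling approach:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT,"
             " updated_at INTEGER)")  # epoch seconds, for simplicity
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "NEW", 100), (2, "NEW", 110)])

replica = {}   # stands in for the target system (warehouse, cloud platform)

def poll_changes(conn, watermark):
    """Capture and apply rows changed since the watermark, then advance it."""
    rows = conn.execute(
        "SELECT id, status, updated_at FROM orders WHERE updated_at > ?"
        " ORDER BY updated_at", (watermark,)).fetchall()
    for oid, status, ts in rows:
        replica[oid] = status           # apply insert/update to the target
        watermark = max(watermark, ts)  # advance only past applied changes
    return watermark

watermark = poll_changes(conn, 0)   # initial load replicates everything
conn.execute("UPDATE orders SET status='SHIPPED', updated_at=120 WHERE id=1")
watermark = poll_changes(conn, watermark)  # second poll picks up only the change
```

Advancing the watermark only after each change is applied is what keeps the replica consistent if a poll fails partway: unapplied changes remain above the watermark and are retried.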

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

The role at Prudential involves performing strategic analysis of structured and unstructured data from various sources. You will be responsible for developing data structures and pipelines to organize, collect, cleanse, and standardize data to generate actionable insights and address reporting needs. Your role will also include defining data requirements, gathering and validating information, and supporting the creation of data quality rules for formal governance. Additionally, you will identify innovative opportunities to develop data insights, maximize data usage, and improve business performance. Acting as a subject matter expert, you will provide complex business intelligence solutions involving SQL and Python, guiding junior team members and engaging extensively with users to understand their requirements and translate them into user-friendly dashboards and insights.

In this position, you will be expected to up-skill continuously, venture into advanced analytics, and become familiar with data science languages like Python and R. You will also be responsible for identifying and managing risks within your area of responsibility, including resolving blockers and bottlenecks.

To be successful in this role, you should hold a university degree in Computer Science, Data Analysis, or a related field, along with a minimum of 4 years' experience as a data analyst. Experience in analyzing mobile application data and preparing business reports, together with exceptional analytical skills, is essential. You should have a good understanding of the power and value of data, the ability to apply technology solutions to meet business needs, and the ability to assess stakeholder requirements to enhance customer experience. Moreover, you must remain resilient under pressure, provide high-quality solutions, meet deadlines consistently, and handle requests and queries from senior management effectively.

Technical requirements include demonstrable experience in data-related roles, knowledge of ETL processes and data warehousing principles, and expertise in data visualization with advanced Tableau skills. Proficiency in SQL and Python or Scala is necessary, along with familiarity with business tools like JIRA and Confluence. This role demands flexibility to work with various technologies and a commitment to continuous learning and development.

Posted 1 month ago

Apply

5.0 - 16.0 years

0 Lacs

karnataka

On-site

You should have a minimum of 7 to 16 years of experience with Airflow and ETL processes, with a strong background in Python development. The primary skills required for this role are GCP, Airflow, and Python.

As a Data Engineer with expertise in cloud technologies, particularly GCP, you will be responsible for developing data pipelines and ETL processes. Your role will involve working with cloud storage and other data engineering services within the GCP environment. To excel in this position, you must possess strong programming skills in Python, PySpark, SQL/BigQuery, and other databases. Experience with test-driven development and building libraries, along with proficiency in Pandas, NumPy, GCP, Elasticsearch, and BigQuery, is highly desirable. Advanced SQL knowledge is preferred, while a basic understanding is required.

If you have a solid background in data engineering and cloud services and a passion for developing data pipelines, this opportunity in Noida is ideal for you.
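What an Airflow DAG expresses, tasks plus dependencies executed in topological order, can be shown with a tiny stdlib sketch (Python 3.9+ for `graphlib`). The task names are hypothetical; Airflow layers scheduling, retries, and operators on top of exactly this idea:

```python
from graphlib import TopologicalSorter

results = []  # records execution order, standing in for real task side effects

def extract():   results.append("extract")
def transform(): results.append("transform")
def load():      results.append("load")

# Dependencies mirror Airflow's `extract >> transform >> load` pattern:
# each task name maps to the set of tasks it depends on.
dag = {
    "transform": {"extract"},
    "load": {"transform"},
}
tasks = {"extract": extract, "transform": transform, "load": load}

for name in TopologicalSorter(dag).static_order():
    tasks[name]()  # run each task only after its upstream tasks finish
```

Because the runner works from the dependency graph rather than a hard-coded order, adding a new branch (say, a second extract feeding the same transform) only requires a new edge in `dag`, which is the property that makes DAG-based orchestrators scale to large pipelines.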

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

punjab

On-site

As the Chief AI Officer (CAIO) at RChilli, located in Mohali, you will play a pivotal role in shaping AI strategy, innovation, and ethical deployment in HR tech. With over 10 years of experience in AI/ML and leadership roles, you will lead AI research and development initiatives, ensuring alignment with business objectives and compliance with industry regulations.

Your responsibilities will include developing and executing the AI strategy, driving product innovation through NLP, machine learning, and predictive analytics, and overseeing the implementation of AI-powered solutions such as automated job descriptions, resume scoring, and chatbots. You will also define a scalable AI roadmap, manage AI infrastructure, and identify opportunities for AI implementation to enhance operational efficiency.

In addition to your technical expertise in AI, data science, and related fields, you will need strong leadership and business skills. Your ability to align AI innovation with business goals, communicate effectively with stakeholders, and build and mentor an AI team will be crucial for success in this role. As a thought leader in the industry, you will represent RChilli in AI forums and conferences, staying ahead of AI trends and advancements in HR tech.

By joining RChilli, you will have the opportunity to lead AI innovation, drive impactful work in HR operations and talent acquisition, and work with a passionate AI research and product team. In return, you will enjoy a competitive salary, benefits, and career growth opportunities. If you are a visionary AI leader ready to transform the HR tech industry, seize this opportunity to join RChilli as our Chief AI Officer and make a significant impact on the future of AI-driven HR solutions.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

jaisalmer, rajasthan

On-site

We are a group of industrial companies based in Saudi Arabia, currently looking for a talented Data Analyst to join our team. As a Data Analyst, you will play a crucial role in analyzing data and generating reports using tools such as Power BI, Visio, and R. Your main responsibilities will include collaborating with stakeholders to understand business requirements, designing and implementing ETL processes, conducting predictive analysis, and supporting project management activities related to data analytics projects.

To excel in this role, you should have proven experience in business analysis and data analytics, along with proficiency in Power BI, Visio, R, and ETL processes. Strong analytical skills and knowledge of database design principles are essential, and excellent communication skills are required, as you will be collaborating with cross-functional teams. This role offers an exciting opportunity to work on diverse projects and contribute to data-driven decision-making.

If you meet the above conditions and are passionate about data and analytics, we encourage you to send your updated CV to careersfz@gmail.com. This is a full-time position with health insurance benefits included. The work location for this role is in person.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

You are a Senior Data Platform Engineer responsible for leading the design, development, and optimization of the data platform infrastructure. Your primary focus will be on driving scalability, reliability, and performance across data systems to enable data-driven decision-making at scale. Working closely with data engineers, analysts, and product teams, you will play a crucial role in enhancing the overall data platform.

Your responsibilities will include architecting and implementing scalable, secure, and high-performance data platforms on the AWS cloud using Databricks. You will build and manage data pipelines and ETL processes using modern data engineering tools such as AWS RDS, REST APIs, and S3-based ingestion. Monitoring and maintaining production data pipelines, along with working on enhancements, will be essential tasks, as will optimizing data systems for improved performance, reliability, and cost efficiency. Implementing data governance, quality, and observability best practices in line with Freshworks standards will be a key focus area, and collaboration with cross-functional teams to support diverse data needs is also critical.

Qualifications for this position include a Bachelor's/Master's degree in Computer Science, Information Technology, or a related field. You should have good exposure to data structures and algorithms, coupled with proven backend development experience using Scala, Spark, or Python. A strong understanding of REST API development, web services, and microservices architecture is essential, and experience with Kubernetes and containerized deployment is a plus. Proficiency with relational databases like MySQL or PostgreSQL is required, as is solid hands-on experience with AWS cloud services and knowledge of version control and CI tools such as Git and Jenkins. Excellent problem-solving skills, critical thinking, and keen attention to detail will be valuable assets in this role.
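One of the storage optimizations a role like this involves, partitioning data so queries scan only the relevant slice, can be illustrated in stdlib Python with date-partitioned files that mimic S3 key prefixes. The paths and event fields are hypothetical; platforms like Databricks apply the same layout with Parquet on S3:

```python
import json
import tempfile
from collections import defaultdict
from pathlib import Path

events = [
    {"dt": "2024-05-01", "user": "a", "value": 1},
    {"dt": "2024-05-01", "user": "b", "value": 2},
    {"dt": "2024-05-02", "user": "a", "value": 3},
]

root = Path(tempfile.mkdtemp())

# Write one file per partition, echoing keys like
# s3://bucket/events/dt=2024-05-01/part-0.json
by_day = defaultdict(list)
for event in events:
    by_day[event["dt"]].append(event)
for day, rows in by_day.items():
    part_dir = root / f"dt={day}"
    part_dir.mkdir()
    (part_dir / "part-0.json").write_text(json.dumps(rows))

def read_partition(day):
    """A query filtered on dt touches only one partition directory."""
    return json.loads((root / f"dt={day}" / "part-0.json").read_text())

may_first = read_partition("2024-05-01")
```

Pruning whole partitions instead of scanning every file is what drives both the performance and the cost-efficiency gains the posting mentions, since object stores bill per byte read.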

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As a full-time, work-from-office MicroStrategy Consultant with 3+ years of experience, based in Nagpur or Pune, your main responsibility will be to collaborate with clients and stakeholders to gather and understand business requirements, then translate those needs into technical specifications for MicroStrategy BI solutions. You will design and develop MicroStrategy reports, dashboards, and interactive visualizations, utilizing MicroStrategy features to create efficient and user-friendly BI solutions.

Another key aspect of your role will involve defining and implementing data models that support reporting and analytics requirements while ensuring data accuracy, integrity, and optimal performance within MicroStrategy. Performance optimization will also be crucial: you will optimize MicroStrategy reports and queries for improved performance and implement best practices to enhance overall system efficiency.

Client collaboration is an essential part of this role, as you will work closely with clients to demonstrate MicroStrategy capabilities, gather feedback, and provide training and support to end-users. Integrating MicroStrategy with various data sources and third-party applications, and collaborating with IT teams to ensure seamless data flow between systems, will also be part of your responsibilities. Moreover, you will design and implement security models within the MicroStrategy environment, defining user roles, access controls, and data security measures, and you will document MicroStrategy solutions, configurations, and best practices to ensure knowledge transfer for future reference.

To excel in this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as a MicroStrategy Consultant. A strong understanding of BI concepts, data modeling, data warehousing, and SQL, excellent problem-solving and analytical skills, and strong communication and interpersonal skills for client interactions are also essential. Preferred skills include MicroStrategy certification, experience with other BI tools such as Tableau, Power BI, or QlikView, knowledge of data visualization best practices, and familiarity with ETL processes and tools. Staying updated on the latest MicroStrategy features and evaluating and recommending new technologies to enhance BI capabilities will also be beneficial.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

This is a contract role based on-site in Mumbai, where you will be responsible for designing and implementing data solutions. Your tasks will include data modeling, developing ETL processes, and managing data warehousing. It is essential to ensure data quality and consistency, optimize data workflows, and support business needs through data analytics.

To excel in this role, you should have relevant work experience in analytics, reporting, and business intelligence tools, with proficiency in Tableau, SQL, ETL processes, and data warehousing solutions, and experience working with medium to large-scale datasets in a business setting. An analytical mindset and problem-solving skills are necessary, along with the ability to collaborate closely with stakeholders to establish dashboards. Understanding customer requirements and designing solutions within an AWS business intelligence environment will be part of your responsibilities, and strong communication skills and experience in customer interactions are essential.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
