
1016 ETL Process Jobs - Page 38

JobPe aggregates listings so you can browse them in one place; applications are submitted directly on the original job portal.

3.0 - 5.0 years

13 - 17 Lacs

Mumbai

Work from Office

Senior Manager - Data Analytics

About the job: Experience: 5-8 years. Location: Mumbai (Andheri). Employment Type: Full-Time.

About Us: We are a dynamic fintech firm dedicated to revolutionizing the financial services industry through innovative data solutions. We believe in leveraging cutting-edge technology to provide superior financial products and services to our clients. Join our team and be a part of this exciting journey.

Job Overview:
- We are looking for a skilled Data Engineer with 3-5 years of experience to join our data team.
- The ideal candidate will have a strong background in ETL processes, data pipeline creation, and database management.
- In this role, you will be responsible for designing, developing, and maintaining scalable data systems and pipelines.

Key Responsibilities:
- Design and develop robust and scalable ETL processes to ingest and process large datasets from various sources.
- Build and maintain efficient data pipelines to support real-time and batch data processing.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Optimize database performance and ensure data integrity and security.
- Troubleshoot and resolve data-related issues and provide support for data operations.
- Implement data quality checks and monitor data pipeline performance.
- Document technical solutions and processes for future reference.

Required Skills and Qualifications:
- Bachelor's/Master's degree in Engineering or a related field.
- 3-5 years of experience in data engineering or a related role.
- Strong proficiency in ETL tools and techniques.
- Experience with SQL and relational databases (MySQL, PostgreSQL).
- Familiarity with big data technologies.
- Proficiency in programming languages such as Python, Java, or Scala.
- Knowledge of data warehousing concepts and tools.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Preferred Qualifications:
- Experience with data visualization tools (Tableau, Power BI).
- Knowledge of machine learning and data science principles.
- Experience with real-time data processing and streaming platforms (Kafka).

What We Offer:
- Competitive compensation package (10-15 LPA) based on experience and qualifications.
- Opportunity to work with a talented and innovative team in the fintech industry.
- Professional development and growth opportunities.
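For context on the kind of batch ETL work this posting describes, here is a minimal, self-contained sketch in Python. It is illustrative only, not code from the employer; the file, table, and column names (transactions.csv, txn_id, amount) are hypothetical.

```python
import csv
import sqlite3

def extract(path):
    # Read raw rows from a source CSV (hypothetical file name).
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Basic cleansing: drop rows missing a key, normalize amounts.
    for row in rows:
        if not row.get("txn_id"):
            continue
        row["amount"] = round(float(row["amount"]), 2)
        yield row

def load(rows, conn):
    # Idempotent upsert keeps pipeline reruns safe, a common ETL requirement.
    conn.executemany(
        "INSERT OR REPLACE INTO transactions (txn_id, amount) "
        "VALUES (:txn_id, :amount)",
        rows,
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS transactions "
        "(txn_id TEXT PRIMARY KEY, amount REAL)"
    )
    load(transform(extract("transactions.csv")), conn)
```

A production pipeline would swap SQLite for the posting's MySQL/PostgreSQL warehouse and add the monitoring and data quality checks the listing mentions, but the extract-transform-load shape stays the same.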

Posted 2 months ago


2.0 - 4.0 years

2 - 6 Lacs

Mumbai

Work from Office

Data Analyst

About The Role: We are seeking a highly motivated and detail-oriented Data Analyst with 2 to 4 years of work experience to join our team. The ideal candidate will have a strong analytical mindset, excellent problem-solving skills, and a passion for transforming data into actionable insights. In this role, you will play a pivotal part in gathering, analyzing, and interpreting data to support informed decision-making and drive business growth.

Key Responsibilities:

Data Collection and Extraction:
- Gather data from various sources, including databases, spreadsheets, and APIs.
- Perform data cleansing and validation to ensure data accuracy and integrity.

Data Analysis:
- Analyze large datasets to identify trends, patterns, and anomalies.
- Conduct analysis and data modeling to generate insights and forecasts.
- Create data visualizations and reports to present findings to stakeholders.

Data Interpretation and Insight Generation:
- Translate data insights into actionable recommendations for business improvements.
- Collaborate with cross-functional teams to understand data requirements and provide data-driven solutions.

Data Quality Assurance:
- Implement data quality checks and validation processes to ensure data accuracy and consistency.
- Identify and address data quality issues promptly.

Qualifications:
- Bachelor's degree in a relevant field such as Computer Science, Statistics, Mathematics, or a related discipline.
- Proven work experience as a Data Analyst, with 2 to 4 years of relevant experience.
- Knowledge of data warehousing concepts and ETL processes is advantageous.
- Proficiency in data analysis tools and languages (SQL, Python, R).
- Experience with data visualization tools (Tableau, Power BI) is a plus.
- Strong analytical and problem-solving skills.
- Excellent communication and presentation skills.
- Attention to detail and a commitment to data accuracy.
- Familiarity with machine learning and predictive modeling is a bonus.

If you are a data-driven professional with a passion for uncovering insights from complex datasets and have the qualifications and skills mentioned above, we encourage you to apply for this Data Analyst position. Join our dynamic team and contribute to making data-driven decisions that will shape our company's future. Fatakpay is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Skills: SQL, Python, and ETL.
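The data cleansing and validation duties above are typically done in pandas. A minimal sketch under assumed column names (order_id, order_date, amount are hypothetical, not from the posting):

```python
import pandas as pd

def clean_orders(path: str) -> pd.DataFrame:
    # Hypothetical columns; adjust to the actual source schema.
    df = pd.read_csv(path)
    df = df.drop_duplicates(subset="order_id")
    df = df.dropna(subset=["order_id", "order_date"])
    # Coerce bad numeric strings to NaN, then flag out-of-range values for review.
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    bad = df[df["amount"].isna() | ~df["amount"].between(0, 1_000_000)]
    if not bad.empty:
        bad.to_csv("rejected_rows.csv", index=False)  # quarantine for follow-up
    return df[~df.index.isin(bad.index)]
```

Quarantining rejected rows rather than silently dropping them supports the "identify and address data quality issues promptly" responsibility.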

Posted 2 months ago


2.0 - 4.0 years

2 - 6 Lacs

Mumbai

Work from Office

Job Title: Data Analyst - Fintech

- We are seeking a highly motivated and detail-oriented Data Analyst with 2 to 4 years of work experience to join our team.
- The ideal candidate will have a strong analytical mindset, excellent problem-solving skills, and a passion for transforming data into actionable insights.
- In this role, you will play a pivotal part in gathering, analyzing, and interpreting data to support informed decision-making and drive business growth.

Key Responsibilities:

Data Collection and Extraction:
- Gather data from various sources, including databases, spreadsheets, and APIs.
- Perform data cleansing and validation to ensure data accuracy and integrity.

Data Analysis:
- Analyze large datasets to identify trends, patterns, and anomalies.
- Conduct analysis and data modeling to generate insights and forecasts.
- Create data visualizations and reports to present findings to stakeholders.

Data Interpretation and Insight Generation:
- Translate data insights into actionable recommendations for business improvements.
- Collaborate with cross-functional teams to understand data requirements and provide data-driven solutions.

Data Quality Assurance:
- Implement data quality checks and validation processes to ensure data accuracy and consistency.
- Identify and address data quality issues promptly.

Qualifications:
- Master's degree in a relevant field such as Computer Science, Statistics, Mathematics, or a related discipline.
- Proven work experience as a Data Analyst, with 2 to 4 years of relevant experience.
- Knowledge of data warehousing concepts and ETL processes is advantageous.
- Proficiency in data analysis tools and languages (e.g., SQL, Python, R).
- Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
- Strong analytical and problem-solving skills.
- Excellent communication and presentation skills.

Posted 2 months ago


5.0 - 8.0 years

14 - 15 Lacs

Gurugram

Work from Office

Immediate openings for the position of Senior Data Engineer / Technical Lead at Mynd Integrated Solutions, a reputed company located in Gurgaon Sector 68.

Key Skills: data engineering, SQL, Python, modern ETL frameworks (Airflow, dbt), cloud data platforms (Snowflake, Redshift, BigQuery), data modeling and warehousing, data governance, and BI dashboards (Power BI, Tableau, Looker); see the full list under Required Skills & Qualifications below.

Notice Period: Immediate joiners are preferred. Experience: 6-8 years. Qualification: Any graduation. CTC: as per market standards. Work from office from Day 1 (5 days working). Job Location: Gurgaon Sector 68.

Interested and serious candidates can send an updated CV to vishnu.peramsetty@myndsol.com. Feel free to contact me for further clarifications: Vishnu Vardhan - 8332951064.

Job Title: Senior Data Engineer / Technical Lead
Experience: 6-8 years
Location: Gurgaon
Reporting To: Head of Data Business / Chief Data Officer (CDO)

Role Overview: We are looking for a dynamic and experienced Senior Data Engineer / Technical Lead to spearhead our foundational data initiatives. This role combines hands-on engineering, strategic thinking, and team leadership to establish scalable infrastructure, implement robust data governance, and deliver actionable analytics for both internal operations and SaaS customers.

Key Responsibilities:
1. Leadership & Strategy: Define and drive the data engineering vision, architecture, and roadmap. Translate business needs into scalable and performant data infrastructure. Lead a small team (4-5 members) of junior data engineers/analysts.
2. Data Infrastructure & Integration: Design, build, and maintain reliable data pipelines and ETL processes. Integrate multiple data sources into a unified data warehouse (e.g., Snowflake, Redshift, BigQuery). Ensure scalable and secure infrastructure for real-time and batch processing.
3. Analytics & Dashboard Delivery: Collaborate with product and business stakeholders to identify key KPIs. Deliver initial and ongoing analytics dashboards for internal stakeholders and external SaaS clients. Support productization of analytics and insights in client-facing interfaces.
4. Data Governance & Compliance: Implement and monitor data quality checks, access policies, and retention standards. Work closely with compliance teams to ensure alignment with ISMS/PIMS standards. Conduct periodic internal audits and support external compliance reviews.
5. Team Enablement & Mentorship: Provide technical guidance, code reviews, and mentoring to junior team members. Foster a culture of continuous improvement, learning, and documentation.

Required Skills & Qualifications:
- 6-8 years of experience in data engineering, analytics, or related fields.
- Hands-on expertise in SQL, Python, and modern ETL frameworks (Airflow, dbt, etc.).
- Proven experience with cloud data platforms such as Snowflake, Redshift, or BigQuery.
- Strong understanding of data modeling, warehousing, and performance optimization.
- Familiarity with data governance and compliance frameworks (e.g., ISO 27701, GDPR).
- Experience in delivering dashboards via Power BI, Tableau, or Looker.
- Excellent communication and stakeholder management skills.

Preferred:
- Experience in a SaaS or multi-tenant analytics environment.
- Exposure to DevOps for Data, CI/CD, and Infrastructure-as-Code tools.
- Certification in cloud platforms (AWS, GCP, or Azure) or data privacy standards.
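Since the posting names Airflow as the expected orchestration framework, here is a minimal sketch of the pipeline pattern involved. This is Airflow 2.x style and purely illustrative; the DAG name and tasks are hypothetical, and operator imports and scheduling arguments vary slightly between Airflow versions.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder callable; a real task would pull from the source systems.
    ...

def load():
    # Placeholder callable; a real task would write to the warehouse.
    ...

with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # declare the dependency: extract runs before load
```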

Posted 2 months ago


8.0 - 13.0 years

10 - 14 Lacs

Mumbai

Work from Office

Job Summary: We are seeking a highly experienced Unica Architect with 8+ years of expertise in designing, implementing, and optimizing Unica solutions. The ideal candidate will play a key role in architecting scalable and high-performing Unica-based marketing automation platforms, integrating Unica with enterprise systems, and driving best practices in campaign management, data strategy, and customer engagement.

Key Responsibilities:

Solution Architecture & Design:
- Design and develop end-to-end Unica Campaign, Unica Interact, Unica Journey, and Unica Plan solutions.
- Define best practices for campaign orchestration, segmentation, and personalization using Unica.
- Architect scalable Unica environments ensuring high performance, reliability, and security.
- Collaborate with business stakeholders, marketing teams, and IT teams to understand requirements and translate them into Unica-based solutions.

Integration & Data Strategy:
- Design seamless integration between Unica and CRM, data warehouses, digital platforms, and real-time decision systems.
- Optimize ETL processes, audience segmentation strategies, and data flows for effective campaign execution.
- Ensure proper data governance, compliance (GDPR, CCPA), and security policies within Unica solutions.

Unica Administration & Performance Optimization:
- Lead the installation, configuration, and maintenance of Unica applications.
- Optimize system performance, troubleshoot issues, and ensure high availability.
- Manage application upgrades, patches, and environment migrations for Unica platforms.
- Maintain the overall Unica system and perform regular housekeeping in an automated way.
- Ensure all integrated surrounding systems are functioning as expected.
- Collaborate with relevant stakeholders on the client side to maintain environment stability.
- Configure notification facilities so that relevant stakeholders are made aware of any system or functional failure.

Campaign Strategy & Execution Support:
- Guide campaign teams in setting up and executing complex multi-channel marketing campaigns.
- Define audience segmentation models, personalization strategies, and A/B testing frameworks.
- Ensure seamless customer journey automation and omnichannel engagement.

Leadership & Stakeholder Management:
- Act as the subject matter expert (SME) and advisor for Unica implementations.
- Mentor and train junior Unica developers and campaign specialists.
- Engage with business leaders, marketing teams, and IT teams to drive Unica adoption and the product roadmap.

Required Skills & Experience:
- 8+ years of experience in HCL Unica Platform, Campaign, Plan, Interact, Journey, Deliver, Link, and related modules.
- Deep understanding of campaign management, audience segmentation, and personalization.
- Expertise in SQL, database design, and ETL processes for Unica.
- Hands-on experience in Unica upgrades, migrations, and performance tuning.
- Strong knowledge of REST APIs, SOAP APIs, and integrations with third-party tools (CRM, CDP, Web Analytics).
- Experience in cloud deployments (AWS, Azure, GCP) for Unica.
- Experience with data integration tools and techniques (e.g., ETL, APIs, data lakes).
- Familiarity with web technologies (HTML, CSS, JavaScript) and CMS platforms.
- Working knowledge of programming languages such as Java, PHP, and Python; experience in developing, debugging, and maintaining code in these languages is preferred.
- Experience with scripting languages like Bash, Shell, or PowerShell for automation and system management tasks.
- Strong understanding of SQL (Structured Query Language) and experience working with relational databases such as MySQL or PostgreSQL, including stored procedures.
- Proficiency in Linux operating systems, networking concepts, and server administration tasks.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Detail-oriented with strong organizational skills.
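The audience segmentation work described above ultimately reduces to queries against the marketing database. A tool-agnostic sketch in Python/SQLite; the customers schema and segment criteria are entirely hypothetical, and a real Unica flowchart would run an equivalent query against the enterprise warehouse rather than this code.

```python
import sqlite3

# Hypothetical schema: customers(customer_id, segment, last_purchase, email_opt_in).
SEGMENT_SQL = """
SELECT customer_id
FROM customers
WHERE email_opt_in = 1
  AND segment = 'high_value'
  AND last_purchase >= date('now', '-90 days')
"""

def build_audience(db_path: str) -> list[str]:
    # Returns the contactable audience list a campaign tool would consume.
    with sqlite3.connect(db_path) as conn:
        return [row[0] for row in conn.execute(SEGMENT_SQL)]
```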

Posted 2 months ago


1.0 - 4.0 years

3 - 6 Lacs

Coimbatore

Work from Office

About Responsive: Responsive, formerly RFPIO, is the market leader in an emerging new category of SaaS solutions called Strategic Response Management. Responsive customers, including Google, Microsoft, BlackRock, T. Rowe Price, Adobe, Amazon, Visa, and Zoom, use Responsive to manage business-critical responses to RFPs, RFIs, RFQs, security questionnaires, due diligence questionnaires, and other requests for information. Responsive has nearly 2,000 customers of all sizes and has been voted "best in class" by G2 for 13 straight quarters. It also counts more than 35% of cloud SaaS leaders and more than 15 of the Fortune 100 among its customers. Customers have used Responsive to close more than $300B in transactions to date.

About The Role: We are seeking a highly skilled Product Data Engineer with expertise in building, maintaining, and optimizing data pipelines using Python scripting. The ideal candidate will have experience working in a Linux environment, managing large-scale data ingestion, processing files in S3, and balancing disk space and warehouse storage efficiently. This role is responsible for ensuring seamless data movement across systems while maintaining performance, scalability, and reliability.

Essential Functions:
- ETL Pipeline Development: Design, develop, and maintain efficient ETL workflows using Python to extract, transform, and load data into structured data warehouses.
- Data Pipeline Optimization: Monitor and optimize data pipeline performance, ensuring scalability and reliability in handling large data volumes.
- Linux Server Management: Work in a Linux-based environment, executing command-line operations, managing processes, and troubleshooting system performance issues.
- File Handling & Storage Management: Efficiently manage data files in Amazon S3, ensuring proper storage organization, retrieval, and archiving of data.
- Disk Space & Warehouse Balancing: Proactively monitor and manage disk space usage, preventing storage bottlenecks and ensuring warehouse efficiency.
- Error Handling & Logging: Implement robust error-handling mechanisms and logging systems to monitor data pipeline health.
- Automation & Scheduling: Automate ETL processes using cron jobs, Airflow, or other workflow orchestration tools.
- Data Quality & Validation: Ensure data integrity and consistency by implementing validation checks and reconciliation processes.
- Security & Compliance: Follow best practices in data security, access control, and compliance while handling sensitive data.
- Collaboration with Teams: Work closely with data engineers, analysts, and product teams to align data processing with business needs.

Education: Bachelor's degree in Computer Science, Data Engineering, or a related field.

Required Experience & Skills:
- 2+ years of experience in ETL development, data pipeline management, or backend data engineering.
- Proficiency in Python: strong hands-on experience writing Python scripts for ETL processes.
- Linux expertise: experience working with Linux servers, command-line operations, and system performance tuning.
- Cloud storage management: hands-on experience with Amazon S3, including handling file storage, retrieval, and lifecycle policies.
- Data pipeline management: experience with ETL frameworks, data pipeline automation, and workflow scheduling (e.g., Apache Airflow, Luigi, or Prefect).
- SQL & database handling: strong SQL skills for data extraction, transformation, and loading into relational databases and data warehouses.
- Disk space & storage optimization: ability to manage disk space efficiently, balancing usage across different systems.
- Error handling & debugging: strong problem-solving skills to troubleshoot ETL failures, debug logs, and resolve data inconsistencies.
- Experience with cloud data warehouses (e.g., Snowflake, Redshift, BigQuery).
- Knowledge of message queues (Kafka, RabbitMQ) for data streaming.
- Familiarity with containerization tools (Docker, Kubernetes) for deployment.
- Exposure to infrastructure automation tools (Terraform, Ansible).

Knowledge, Ability & Skills:
- Strong analytical mindset and ability to handle large-scale data processing efficiently.
- Ability to work independently in a fast-paced, product-driven environment.
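The S3 file handling and disk-space balancing duties combine naturally in a pre-ingestion guardrail. A sketch using boto3 and the standard library; the bucket, prefix, and mount path are hypothetical, and AWS credentials are assumed to be configured in the environment.

```python
import shutil
import boto3

s3 = boto3.client("s3")  # assumes AWS credentials are already configured

def pending_files(bucket: str, prefix: str) -> list[str]:
    # List staged input files awaiting ingestion (names are illustrative).
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    return [obj["Key"] for obj in resp.get("Contents", [])]

def disk_headroom_ok(path: str = "/data", min_free_ratio: float = 0.2) -> bool:
    # Guardrail before pulling large files: keep at least 20% free by default.
    usage = shutil.disk_usage(path)
    return usage.free / usage.total >= min_free_ratio

if __name__ == "__main__":
    if disk_headroom_ok():
        for key in pending_files("etl-landing-zone", "incoming/"):
            print("would download:", key)
    else:
        print("low disk space: skipping this ingestion cycle")
```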

Posted 2 months ago


2.0 - 5.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Job title: Senior Analyst, Business Analytics. Location: Hyderabad. Travel: as required by business need. Job type: Permanent, full time.

About The Job: The Go-To-Market Capabilities (GTMC) Hub is an internal Sanofi resource organization based in Hyderabad, India, set up to centralize processes and activities that support the Specialty Care, Vaccines, General Medicines, CHC, CMO, and R&D, Data & Digital functions. GTMC strives to be a strategic and functional partner for tactical deliveries to Medical, HEVA, and Commercial organizations across Sanofi globally.

At Sanofi we are leveraging analytics and technology on behalf of patients around the world. We are seeking those who have a passion for using data, analytics, and insights to drive decision-making that will allow us to tackle some of the world's greatest health threats. Within our Insights & Analytics organization we are transforming to better power decision-making across our end-to-end commercialization process, from business development to late lifecycle management. Deliverables support planning and decision-making across multiple functional areas such as finance, manufacturing, product development, and commercial. In addition to ensuring high-quality deliverables, our team drives synergies across the franchise, fosters innovation and best practices, and creates solutions that bring speed, scale, and shareability to our planning processes.

We are looking for a Senior Analyst to support our analytics and reporting team. Robust analytics and reporting is a priority for our businesses, as product potential has major implications for a wide range of disciplines. It is essential to have someone who understands and aspires to implement innovative analytics techniques to drive insight generation across the GBU.

Job Summary: The Senior Analyst will be responsible for leading analytics and measurement in support of Direct-to-Consumer and Direct-to-Patient (DTC/P) marketing campaigns across brands in Sanofi's US Specialty Care, General Medicines, and Vaccines portfolios. The ideal candidate has a keen ability to translate disparate data into insights, a deep understanding of the customer journey and the impact of media at each stage of the funnel, and experience designing and executing analytics plans for DTC/P campaigns. This person will be an integral partner to the US Go-to-Market Capabilities (GTMC) Marketing Sciences and Omnichannel teams as well as the US Commercial teams (DTC Marketing). They will also partner and collaborate closely with various Digital teams on key cross-functional projects such as the Consumer Omnichannel Data Hub and DTC/P enablement via data cleanroom technology (i.e., LiveRamp).

Main Responsibilities:

Stakeholder Management:
- Maintain effective relationships with stakeholders (Commercial Brand and GTMC Omnichannel & Marketing Sciences teams) within the allocated GBU and tasks.
- Communicate analysis results and recommendations to business partners.
- (Manager level) Actively lead and develop Consumer Media Analysts and ensure innovative technologies are leveraged.
- Work collaboratively with stakeholder teams to prioritize work and deliver on time-sensitive requests.
- Collaborate with stakeholders on project planning, setting timelines, and maintaining budget.

Data & Technology:
- Collaborate with Data Management & Digital teams to integrate and unify first-party, second-party, and third-party media data sets from disparate sources, creating a 360-degree view of consumer interactions across channels at various stages of the consumer marketing funnel.
- Contribute to the centralization of data across consumer media channels, addressing data inconsistencies and quality issues.
- Inform media tagging and taxonomy governance efforts to standardize tags, categories, values, and deployment, with an aim to enhance data organization and downstream campaign analytics.

Measurement:
- Lead development and maintenance of KPI reports for performance by DTC channel (Linear TV, Streaming, Display, OLV, Social, Search, Web, Email) to enable business decisions. Insights should be provided at various levels of granularity (national, by geo, brand/indication, audience segment, channel, partner, etc.).
- Develop multi-channel attribution models to understand the effectiveness of marketing channels in driving awareness, conversion, and adherence among consumers.
- Engage with agency partners to ensure insights are based on reliable, high-quality information and weave in context surrounding media execution strategy.
- Act as a strategic partner to the GTMC Omnichannel team in understanding channel performance, assessing new tactics, and pushing forward omnichannel initiatives.
- Deliver DTC channel optimization recommendations using insights and analytics, in partnership with Advanced Analytics, Market Research, Brand Analytics, and/or Digital.
- Pioneer innovative analytic approaches for mid- and long-term impact measurement while providing guidance on short-term analytics.
- Maintain a strong pulse on industry digital and analytic trends in DTC/P marketing.

Process:
- Contribute to overall quality enhancement by ensuring high standards for outputs produced by the team.
- Secure adherence to compliance procedures and internal/operational risk controls in accordance with all applicable standards.
- Use state-of-the-art methodologies and partner with internal teams to continuously improve data quality and availability by building business processes that support global standardization.
- Work cross-functionally to gather requirements, analyze data, and generate insights and reports that can be used by the GBU.

Technical Skills:

Data Analysis & Tools: Proficient in data analysis tools such as Excel, SQL, Python, and R. Familiar with data visualization tools like Power BI and Tableau.

Statistical & Predictive Modeling: Expert in advanced statistical techniques and marketing analytics, including hypothesis testing, regression analysis, time series analysis, clustering, classification, campaign effectiveness and ROI analysis, customer journey mapping, customer segmentation strategies, factor analysis, and A/B testing. Experience building models to inform Next Best Channel and Next Best Content, ensuring the most effective day, content, and channel to engage consumers based on their stage in the patient journey. Skilled in applying machine learning algorithms and data-driven approaches to uncover actionable insights, optimize marketing performance, and drive personalized customer experiences across multiple channels.

Database & Data Management: Proficient in working with modern data warehouses (particularly Snowflake) and cloud-based data platforms; strong understanding of ETL processes and experience handling large-scale datasets efficiently. Skilled in SQL for complex data querying and manipulation; experienced in version control using GitHub for collaborative coding and managing data pipelines. Familiarity with data governance, quality management, and security compliance (e.g., HIPAA, GDPR) in a pharmaceutical context; knowledge of data integration techniques for consolidating diverse healthcare and marketing data.

Marketing Analytics Knowledge: Comprehensive understanding of key performance metrics across diverse marketing channels, including TV, SEO, SEM, social media, display advertising, email marketing, and point-of-care (POC) initiatives. Proficient in analyzing consumer/patient engagement across multiple channels (e.g., email, web, mobile, field force), with the ability to derive actionable insights for optimizing omnichannel strategies in the pharmaceutical industry. Strong grasp of media channel dynamics, including paid, owned, and earned media, with expertise in cross-channel attribution and optimization techniques specific to healthcare professional targeting.

Experience: 3-6 years of experience in US consumer media analytics (ideally in the US pharma domain). In-depth knowledge of common commercial pharmaceutical datasets (e.g., IQVIA Xponent and APLD) as well as understanding of direct-to-consumer/patient media channels such as linear and streaming TV, social, online video, websites, mobile applications, email, programmatic and endemic display, and paid search vs. SEO. Hands-on experience with digital marketing platforms (e.g., Google Analytics, Facebook Ads Manager, Campaign Manager, Google Ads) and LiveRamp Cleanroom is preferred.

Education: Advanced degree in areas such as Management, Statistics, Decision Sciences, Engineering, Life Sciences, or Business Analytics, or a related field (e.g., PhD / MBA / Master's).

Soft Skills:
- Problem Solving: Ability to quickly grasp complex business problems and provide data-driven solutions; ability to break down complex problems into smaller, manageable steps.
- Communication and Collaboration: Skilled in translating complex data findings into actionable insights for non-technical teams; proven ability to work effectively across all levels of stakeholders and diverse functions.
- Learning & Adaptability: Eager to learn new tools and technologies and able to adapt quickly to changing environments; a curious and inquisitive mindset to explore data and discover new insights.
- Time Management and Prioritization: Capable of managing time effectively in a fast-paced, multi-tasking environment and delivering high-quality analysis on time; excellent planning, design, project management, and documentation skills.

Languages: Excellent English communication skills, written and spoken.

Why choose us:
- Bring the miracles of science to life alongside a supportive, future-focused team.
- Discover endless opportunities to grow your talent and drive your career, whether through a promotion or lateral move, at home or internationally.
- Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact.
- Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs, and at least 14 weeks' gender-neutral parental leave.
- Play an instrumental part in creating best practice within our Go-to-Market Capabilities.
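The A/B testing called out under Statistical & Predictive Modeling typically comes down to comparing conversion rates between a test and a control media cell. A self-contained two-proportion z-test sketch (the counts are made-up illustration, not campaign data):

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    # Pooled two-proportion z-test, e.g. test vs. control conversion rates.
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_ztest(conv_a=420, n_a=10_000, conv_b=365, n_b=10_000)
print(f"z={z:.2f}, p={p:.4f}")  # p < 0.05 suggests the cells truly differ
```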

Posted 2 months ago


4.0 - 5.0 years

8 - 9 Lacs

Hyderabad

Work from Office

Role & Responsibilities

Role: Third Party Data Acquisition Study Analyst
- Collaborate closely with the Third Party Data Acquisition Study Analyst to facilitate the setup and acquisition of external clinical data at the study level during study startup, conduct, and closeout phases.
- Ensure data accuracy, completeness, and consistency through effective cleaning, validation, and transformation processes.
- Monitor data feeds regularly to uphold data quality and accuracy.
- Support the setup of infrastructure enabling external data integration into the client's clinical data pipelines.
- Provide guidance and coordination to vendors on data acquisition systems, including GlobalScape, Veeva FTP, Veeva training, and IP whitelisting.
- Review incoming data from multiple sources to confirm compliance with Data Transfer Agreements (DTAs) regarding format, file extensions, and field requirements. Validate all third-party data generated during clinical trials for integration into the client's clinical data ecosystem.
- Assist the Third Party Data Acquisition Study Analyst in conducting periodic Trial Master File (TMF) reviews per the study TMF plan.
- Adhere to procedural documents and participate in their review and updates to ensure alignment with industry standards, regulatory requirements, and best practices.
- Coordinate with and review deliverables from external partners (e.g., labs, eCOA providers, technology vendors) performing services for the client.
- Appropriately escalate issues to the Third Party Data Acquisition Study Lead.
- Routinely monitor proprietary applications for scanned mail and distribute documents to relevant teams.
- Perform regular quality checks to ensure optimal system performance and data integrity.
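The DTA compliance review described above (format, file extensions, required fields) lends itself to a simple automated pre-check. A sketch under hypothetical assumptions: real DTAs specify the exact naming, extensions, and fields per vendor, and the field names below are placeholders.

```python
import csv
from pathlib import Path

# Illustrative requirements only; the real values come from each study's DTA.
REQUIRED_FIELDS = {"SUBJECT_ID", "VISIT", "COLLECTION_DATE", "RESULT"}
ALLOWED_EXTENSIONS = {".csv", ".txt"}

def check_transfer(path: str) -> list[str]:
    # Returns a list of human-readable issues; empty list means the file passes.
    issues = []
    p = Path(path)
    if p.suffix.lower() not in ALLOWED_EXTENSIONS:
        issues.append(f"unexpected file extension: {p.suffix}")
    with open(p, newline="") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_FIELDS - set(reader.fieldnames or [])
        if missing:
            issues.append(f"missing required fields: {sorted(missing)}")
        for i, row in enumerate(reader, start=2):  # row 1 is the header
            if not row.get("SUBJECT_ID"):
                issues.append(f"row {i}: blank SUBJECT_ID")
    return issues
```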

Posted 2 months ago


5.0 - 10.0 years

15 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Title:
========
SAP BO Senior Consultant (x3 positions)

Job Location: (Multiple)
=========
Dubai - UAE
Riyadh - Saudi Arabia
Doha - Qatar

Type of Job:
==========
Work from office

Salary per Month:
===============
USD $1,500 - $4,000, tax-free, depending on experience

Project duration: 1-2 years
Experience needed: 5 years or above
Qualification: BTech/MTech/MCA/MSc IT or any equivalent

Responsibilities:
- Work closely with business users to understand their needs and translate them into effective reporting and analytics solutions using SAP BusinessObjects tools (Web Intelligence, Crystal Reports, SAP Lumira, SAP Analytics Cloud, SAP Dashboards, SAP BusinessObjects Explorer).
- Manage all phases of SAP BusinessObjects implementation projects, from requirements gathering and solution design to development, testing, and deployment.
- Design, develop, and deploy complex reports, dashboards, and visualizations that effectively communicate key business insights and trends to stakeholders.
- Troubleshoot technical issues within the SAP BusinessObjects environment, including performance tuning, system configuration, and resolving connectivity problems.
- Ensure data accuracy and integrity by integrating SAP BusinessObjects with various data sources, including SAP BW, SAP HANA, relational databases, and non-SAP systems.
- Develop and deliver training programs to educate users on SAP BusinessObjects functionality and empower them to leverage the platform for self-service analytics.

Qualifications:
- 5-10 years of experience working with SAP BusinessObjects.
- Proven experience in leading and managing SAP BusinessObjects implementation projects.
- Strong proficiency in developing reports, dashboards, and visualizations using SAP BusinessObjects tools.
- In-depth knowledge of business intelligence concepts, data warehousing principles, data modeling, ETL processes, and dimensional modeling.
- Strong SQL skills for data manipulation and query optimization.
- SAP certifications in BusinessObjects (e.g., "SAP Certified Application Associate - BusinessObjects") are an added advantage.

No. of positions: 3

Nice to have:
===========
Any onsite experience is an added plus
Any SAP certifications are an added advantage

Business Verticals:
====================
Oil and Gas
Petrochemical Industries
Banking and Financial Services
Capital Markets
Telecom
Automotive
Healthcare
Logistics / Supply Chain

Job Ref Code: SAP_BO_0525

Email:
=====
spectrumconsulting1985@gmail.com

If you are interested, please email your PDF/Word CV quoting job ref. code [ SAP_BO_0525 ] as the subject.

Posted 2 months ago


5.0 - 10.0 years

5 - 15 Lacs

Jaipur

Hybrid

Roles and Responsibilities

Role: Argus and other data products are growing, and we need to ensure data accuracy across MS SQL, SSIS, and Snowflake platforms. This role is important for maintaining reliable data quality and reports.

Responsibilities:
- Develop and implement automated testing strategies to ensure the quality and accuracy of data across MS SQL, SSIS, and Snowflake platforms.
- Collaborate with Data and Analytics developers and stakeholders to understand data requirements and design comprehensive test plans.
- Execute test cases, analyze results, and identify and report defects to ensure timely resolution.
- Establish testing processes and procedures, including test data management and version control, to maintain consistency and reliability.
- Monitor and evaluate the performance of products and processes, recommending improvements to enhance data quality and efficiency.
- Stay updated on emerging trends and best practices in data testing and automation, incorporating new technologies and methodologies as appropriate.
- Work in compliance with the Hydro quality system, HSE regulations, policies, and standardized operating procedures.
- Perform all other tasks assigned by the superior in charge which may be necessitated by the operations of the related unit and which do not conflict with any applicable laws, statutory provisions, and company rules.
- Comply with area-specific customer requirements.

Required Qualifications and Skills:
- 5+ years of experience as a Manual/Automated Tester or Quality Assurance Analyst in a business intelligence environment.
- Proven track record of designing and executing automated test scripts using industry-standard tools and frameworks (e.g., Selenium, JUnit, TestNG).
- Experience in testing data pipelines, ETL processes, and data warehousing solutions across multiple platforms such as MS SQL Server, SSIS, and Snowflake.
- Strong analytical skills with the ability to identify data anomalies and discrepancies and troubleshoot issues effectively.
- Demonstrated ability to work collaboratively with cross-functional teams and communicate effectively with technical and non-technical stakeholders.
- Experience in Scrum/Agile methodology.
- Experience in the manufacturing domain is a plus.
- BE/B.Tech, MCA, or a Bachelor's degree in Computer Science, Information Systems, Business Administration, or a related field.
- Fluent English.
- Demonstrated capability to solve complex analytical problems through the internalization of domain knowledge and the application of technical expertise.
- Excellent communication and interpersonal skills, with the ability to work effectively with cross-functional teams and stakeholders.
- Ability to work independently, manage multiple projects simultaneously, and deliver high-quality results within tight deadlines.

What we offer you:
- Working at the world's only fully integrated aluminum and leading renewable energy company
- Diverse, global teams
- Flexible work environment/home office
- The freedom to be creative and to learn from experts
- The possibility to grow with the company and gain new certificates
- An attractive benefits package
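To make the automated data-testing responsibilities concrete, here is a minimal pytest-style reconciliation sketch. SQLite stands in as a runnable placeholder: in the real setup the source would be MS SQL (e.g., via pyodbc) and the target Snowflake (via its Python connector), and the sales table is hypothetical.

```python
import sqlite3
import pytest

@pytest.fixture
def connections():
    # In-memory stand-ins for the real source and target databases.
    src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for conn in (src, tgt):
        conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")
        conn.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.0), (2, 20.5)])
    return src, tgt

def test_row_counts_match(connections):
    # A core ETL check: nothing was dropped or duplicated in transit.
    src, tgt = connections
    src_count = src.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
    tgt_count = tgt.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
    assert src_count == tgt_count

def test_no_null_keys(connections):
    # Business keys must survive the load intact.
    _, tgt = connections
    nulls = tgt.execute("SELECT COUNT(*) FROM sales WHERE id IS NULL").fetchone()[0]
    assert nulls == 0
```

Wiring tests like these into CI gives the repeatable, versioned test process the posting asks for.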

Posted 2 months ago


5.0 - 10.0 years

15 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Title:
========
Senior SAP BI Consultant (x3)

Job Location: (Multiple)
=========
Dubai - UAE
Riyadh - Saudi Arabia
Doha - Qatar

Type of Job:
==========
Work from office

Salary per Month:
===============
USD $1,500 - $4,000, tax-free, depending on experience

Project duration: 1-2 years
Experience needed: 5 years or above
Qualification: BTech/MTech/MCA/MSc IT or any equivalent

Responsibilities:
- Design and develop complex data warehouses, universes, and dashboards using SAP BI tools (BW/4HANA, Lumira, BOBJ).
- Develop and implement ETL (Extract, Transform, Load) processes to ensure accurate data integration from various sources.
- Lead and manage the full lifecycle of SAP BI projects, from requirements gathering to deployment and ongoing support.
- Partner with business stakeholders to understand their data needs and translate them into effective BI solutions.
- Utilize data modeling techniques to ensure data integrity, consistency, and efficient retrieval.
- Create insightful reports and visualizations to communicate complex data and trends in a clear and concise manner.

Qualifications:
- 5+ years of experience in SAP BI implementation and configuration, with a proven track record of successful project delivery.
- Strong understanding of data modeling techniques, ETL processes, and data governance principles.
- In-depth knowledge of SAP BI tools (BW/4HANA, Lumira, BOBJ) and data warehousing concepts.
- Experience with SQL scripting and data manipulation techniques.
- Experience with cloud-based BI platforms (e.g., SAP Analytics Cloud) is a plus.
- Strong SAP BI certification (C_BW7.200, C_BOBJ_BR15) is a plus.

No. of positions: 3

Nice to have:
===========
Any onsite experience is an added plus
Any SAP certifications are an added advantage

Business Verticals:
====================
Oil and Gas
Petrochemical Industries
Banking and Financial Services
Capital Markets
Telecom
Automotive
Healthcare
Logistics / Supply Chain

Job Ref Code: SAP_BI_0525

Email:
=====
spectrumconsulting1985@gmail.com

If you are interested, please email your PDF/Word CV quoting job ref. code [ SAP_BI_0525 ] as the subject.

Posted 2 months ago


5.0 - 10.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Title: Oracle SCM Developer/Consultant (x3)

Job Location: (Multiple)
===========
Dubai - UAE
Doha - Qatar
Kuwait City - Kuwait
Riyadh - Saudi Arabia
Muscat - Oman

Email: spectrumconsulting1997@gmail.com
Salary per Month: 10k to 15k AED per month, tax-free
Project Duration: 2 years
Desired Experience: 5-10 years
Work permit and visa will be sponsored by the company
Qualification: B.E/B.Tech/MCA/M.Tech/MSc IT or any equivalent

Key Responsibilities:
- Implement, configure, and support Oracle E-Business Suite (EBS) SCM modules such as Inventory, Purchasing, Order Management, Advanced Pricing, and WMS.
- Develop and maintain PL/SQL stored procedures, functions, packages, and triggers.
- Write and optimize complex SQL queries for data manipulation and reporting.
- Design and implement data migration strategies and ETL processes.
- Develop and customize reports using Oracle BI Publisher, Oracle Reports, and Oracle Analytics Cloud.
- Customize and extend Oracle Forms and Reports, OAF, and ADF applications.
- Design and implement automated workflows using Oracle Workflow Builder and AME.
- Collaborate with clients to gather requirements, conduct gap analysis, and translate business needs into technical solutions.
- Perform root cause analysis and resolve complex technical issues.
- Participate in Oracle SCM implementation projects, including requirements gathering, configuration, testing, and deployment.
- Provide training and support to end users and clients.

Required Skills and Qualifications:
- 5-10 years of experience as an Oracle SCM Consultant/Developer.
- In-depth knowledge of Oracle EBS SCM modules and Oracle Fusion Cloud SCM (optional but beneficial).
- Proficiency in PL/SQL programming and SQL query optimization.
- Experience with Oracle Integration Cloud (OIC) and web services integration (REST/SOAP).
- Strong understanding of data migration strategies and ETL processes.
- Proficiency in Oracle BI Publisher, Oracle Reports, and Oracle Analytics Cloud.
- Experience with Oracle Forms and Reports customization, OAF, and ADF.
- Knowledge of Oracle Workflow Builder and AME.
- Hands-on experience with Oracle SCM implementation and upgrade projects.

Functional Vertical:
- Any financial services (banking/insurance or related)
- Telecom / Healthcare / Retail
- Logistics / Utilities / Energy sector (Oil and Gas / Power)

Nice to have:
- Any Oracle certifications are an added advantage
- Any onsite experience is an added advantage

No. of positions: 3

Benefits:
- Onsite work permit + visa + insurance + air ticket will be sponsored by the company
- Long-term (2-year) project

Job Ref Code: ORA_SCM_0525
Email: spectrumconsulting1997@gmail.com

If you are interested, please email or WhatsApp your CV as an attachment with job ref. code [ ORA_SCM_0525 ] as the subject.

Posted 2 months ago


5.0 - 8.0 years

7 - 13 Lacs

Kolkata

Work from Office

Job Summary: We are seeking a skilled and motivated Data Engineer with 3-5 years of experience to join our growing data team. The ideal candidate will be responsible for designing, developing, testing, deploying, and maintaining robust, scalable, and efficient data pipelines and infrastructure. You will work closely with data scientists, analysts, software engineers, and business stakeholders to understand data requirements and deliver high-quality data solutions that drive business insights and decisions.

Key Responsibilities:
- Design, build, and maintain scalable and reliable ETL/ELT data pipelines to ingest, transform, and load data from diverse sources (e.g., relational databases, APIs, streaming platforms, flat files).
- Develop and manage data warehousing solutions, ensuring data integrity, optimal performance, and cost-effectiveness.
- Implement data models, data schemas, and data dictionaries to support business and analytical requirements.
- Ensure data quality, consistency, and accuracy across all data systems by implementing data validation, cleansing, and monitoring processes.
- Optimize data pipeline performance and troubleshoot data-related issues.
- Collaborate with data scientists and analysts to provide them with clean, well-structured, and readily accessible data for their analysis and modeling needs.
- Implement and maintain data security and governance best practices.
- Automate data processes and workflows using scripting and orchestration tools.
- Document data pipelines, architectures, and processes.
- Stay up to date with emerging data technologies and best practices, and recommend improvements to our existing data stack.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related technical field.
- 5-8 years of hands-on experience in a Data Engineering role.
- Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
- Proficiency in Python.
- Experience building and optimizing data pipelines using ETL/ELT tools and frameworks (e.g., Apache Airflow, dbt, Informatica, Talend, custom scripts).
- Hands-on experience with big data technologies (e.g., Apache Spark, Hadoop ecosystem - HDFS, MapReduce, Hive).
- Experience with cloud platforms (e.g., Azure - ADLS, Databricks, Synapse; GCP - GCS, BigQuery, Dataflow).
- Understanding of data warehousing concepts and experience with data warehouse solutions (e.g., Snowflake, Redshift, BigQuery, Synapse Analytics).
- Familiarity with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
- Experience with version control systems (e.g., Git).
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
- Ability to manage multiple tasks and projects simultaneously.

Preferred/Bonus Skills:
- Experience with real-time data streaming technologies (e.g., Apache Kafka, Kinesis, Flink, Spark Streaming).
- Knowledge of containerization and orchestration (e.g., Docker, Kubernetes).
- Familiarity with CI/CD pipelines for data engineering.
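Since this posting combines Spark with cloud object storage, here is a minimal PySpark batch-transform sketch of the ingest-transform-load pattern it describes. Paths, bucket names, and columns are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal batch transform sketch; all names below are illustrative.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

orders = spark.read.option("header", True).csv("s3://raw-bucket/orders/")

daily_revenue = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())          # basic cleansing step
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

# Partitioned Parquet is a common warehouse-friendly output layout.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-bucket/daily_revenue/"
)
```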

Posted 2 months ago


3 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office

Location: IN - Bangalore | Posted: Today | Application deadline: May 22, 2025 (5 days left to apply) | Job requisition ID: R140300

Company Overview: A.P. Moller - Maersk is an integrated container logistics company and member of the A.P. Moller Group, connecting and simplifying trade to help our customers grow and thrive. With a dedicated team of over 95,000 employees operating in 130 countries, we go all the way to enable global trade for a growing world. From the farm to your refrigerator, or the factory to your wardrobe, A.P. Moller - Maersk is developing solutions that meet customer needs from one end of the supply chain to the other.

About the Team: At Maersk, the Global Ocean Manifest team is at the heart of global trade compliance and automation. We build intelligent, high-scale systems that seamlessly integrate customs regulations across 100+ countries, ensuring smooth cross-border movement of cargo by ocean, rail, and other transport modes. Our mission is to digitally transform customs documentation, reducing friction, optimizing workflows, and automating compliance for a complex web of regulatory bodies, ports, and customs authorities. We deal with real-time data ingestion, document generation, regulatory rule engines, and multi-format data exchange while ensuring resilience and security at scale.

Key Responsibilities:
- Work with large, complex datasets and ensure efficient data processing and transformation.
- Collaborate with cross-functional teams to gather and understand data requirements.
- Ensure data quality, integrity, and security across all processes.
- Implement data validation, lineage, and governance strategies to ensure data accuracy and reliability.
- Build, optimize, and maintain ETL pipelines for structured and unstructured data, ensuring high throughput, low latency, and cost efficiency.
- Build scalable, distributed data pipelines for processing real-time and historical data.
- Contribute to the architecture and design of data systems and solutions.
- Write and optimize SQL queries for data extraction, transformation, and loading (ETL).
- Advise Product Owners to identify and manage risks, debt, issues, and opportunities for technical improvement.
- Provide continuous improvement suggestions for internal code frameworks, best practices, and guidelines.
- Contribute to engineering innovations that fuel Maersk's vision and mission.

Required Skills & Qualifications:
- 4+ years of experience in data engineering or a related field.
- Strong problem-solving and analytical skills.
- Experience with Java and the Spring framework.
- Experience building data processing pipelines using Apache Flink and Spark.
- Experience in distributed data lake environments (Dremio, Databricks, Google BigQuery, etc.).
- Experience with Apache Kafka and Kafka Streams.
- Experience working with databases, PostgreSQL preferred, with solid experience in writing and optimizing SQL queries.
- Hands-on experience in cloud environments such as Azure (preferred), AWS, Google Cloud, etc.
- Experience with data warehousing and ETL processes.
- Experience in designing and integrating data APIs (REST/GraphQL) for real-time and batch processing.
- Knowledge of Great Expectations, Apache Atlas, or DataHub would be a plus.
- Knowledge of RBAC, encryption, and GDPR compliance would be a plus.

Business skills:
- Excellent communication and collaboration skills.
- Ability to translate between technical language and business language, and communicate to different target groups.
- Ability to understand complex designs.
- Ability to balance competing forces and opinions within the development team.

Personal profile:
- Fact-based and result-oriented.
- Ability to work independently and guide the team.
- Excellent verbal and written communication.

Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing .
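The team's stack is Java/Flink/Kafka, but the validate-then-process pattern behind the manifest pipeline can be sketched compactly with the kafka-python client. Everything here is hypothetical: the topic, broker address, and event schema are illustrative, not Maersk's.

```python
import json
from kafka import KafkaConsumer  # kafka-python client

consumer = KafkaConsumer(
    "manifest-events",                     # hypothetical topic
    bootstrap_servers="localhost:9092",    # hypothetical broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

REQUIRED = {"shipment_id", "origin_port", "destination_port"}

for message in consumer:
    event = message.value
    missing = REQUIRED - event.keys()
    if missing:
        # A production pipeline would route this to a dead-letter topic.
        print(f"invalid event at offset {message.offset}: missing {sorted(missing)}")
        continue
    # ...downstream transformation / persistence would go here...
```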

Posted 2 months ago


5 - 9 years

7 - 11 Lacs

Kochi, Coimbatore, Thiruvananthapuram

Work from Office

Job Title: Senior Data Engineer (Graph DB Specialist), Global Song
Management Level: 9 (Specialist)
Location: Kochi, Coimbatore
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: Proficiency in Python and PySpark programming

Job Summary: We are seeking a highly skilled Data Engineer with expertise in graph databases to join our dynamic team. The ideal candidate will have a strong background in data engineering, graph query languages, and data modeling, with a keen interest in leveraging cutting-edge technologies like vector databases and LLMs to drive functional objectives.

Your responsibilities will include:
- Design, implement, and maintain ETL pipelines to prepare data for graph-based structures.
- Develop and optimize graph database solutions using query languages such as Cypher, SPARQL, or GQL; Neo4j experience is preferred.
- Build and maintain ontologies and knowledge graphs, ensuring efficient and scalable data modeling.
- Integrate vector databases and implement similarity search techniques, with a focus on Retrieval-Augmented Generation (RAG) methodologies and GraphRAG.
- Collaborate with data scientists and engineers to operationalize machine learning models and integrate them with graph databases.
- Work with Large Language Models (LLMs) to achieve functional and business objectives.
- Ensure data quality, integrity, and security while delivering robust and scalable solutions.
- Communicate effectively with stakeholders to understand business requirements and deliver solutions that meet objectives.

Professional & Technical Skills:
- Experience: At least 5 years of hands-on experience in data engineering, including 2 years working with graph databases.
- Querying: Advanced knowledge of the Cypher, SPARQL, or GQL query languages.
- ETL Processes: Expertise in designing and optimizing ETL processes for graph structures.
- Data Modeling: Strong skills in creating ontologies and knowledge graphs; presenting data for GraphRAG-based solutions.
- Vector Databases: Understanding of similarity search techniques and RAG implementations.
- LLMs: Experience working with Large Language Models for functional objectives.
- Communication: Excellent verbal and written communication skills.
- Cloud Platforms: Experience with Azure analytics platforms, including Function Apps, Logic Apps, and Azure Data Lake Storage (ADLS).
- Graph Analytics: Familiarity with graph algorithms and analytics.
- Agile Methodology: Hands-on experience working in Agile teams and processes.
- Machine Learning: Understanding of machine learning models and their implementation.

Qualifications:
- Experience: A minimum of 5-10 years of experience is required.
- Educational Qualification: Any graduation / BE / B.Tech.
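As an illustration of the graph querying this role centers on, here is a small sketch using the official Neo4j Python driver. The connection details and the co-purchase graph model (Product, Customer, BOUGHT) are hypothetical, chosen only to show the Cypher pattern-matching style.

```python
from neo4j import GraphDatabase  # official Neo4j Python driver

# Connection details are placeholders for a real deployment.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def products_bought_together(product_id: str, limit: int = 5):
    # Two-hop Cypher pattern: customers who bought X also bought which products?
    query = """
    MATCH (p:Product {id: $pid})<-[:BOUGHT]-(:Customer)-[:BOUGHT]->(other:Product)
    WHERE other.id <> $pid
    RETURN other.id AS id, count(*) AS freq
    ORDER BY freq DESC LIMIT $limit
    """
    with driver.session() as session:
        return [record.data() for record in session.run(query, pid=product_id, limit=limit)]
```

The same pattern-matching approach extends to ontology traversal and GraphRAG retrieval, where a query like this would fetch the subgraph handed to an LLM as context.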

Posted 2 months ago


5 - 10 years

7 - 12 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members, analyzing requirements, and developing solutions to meet business needs.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead and mentor junior team members
- Conduct regular code reviews to ensure quality standards are met

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BusinessObjects Data Services
- Strong understanding of ETL processes
- Experience in data modeling and database design
- Knowledge of SAP BusinessObjects reporting tools
- Hands-on experience in troubleshooting and debugging applications

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services
- This position is based at our Hyderabad office
- 15 years of full-time education is required

Posted 2 months ago


7 - 12 years

9 - 14 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Informatica PowerCenter
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer based in Pune, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in the development and implementation of software solutions.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead and mentor junior professionals
- Conduct code reviews and ensure coding standards are met

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica PowerCenter
- Strong understanding of ETL concepts
- Experience in data warehousing and data modeling
- Hands-on experience in performance tuning and optimization
- Knowledge of SQL and relational databases

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Informatica PowerCenter
- This position is based at our Pune office
- 15 years of full-time education is required

Posted 2 months ago

Apply

3 - 8 years

5 - 10 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: No Function Specialty
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the smooth functioning of applications and their alignment with business needs.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Design, develop, and test applications based on business requirements.
- Troubleshoot and debug applications to ensure their smooth functioning.
- Ensure the security and integrity of applications by implementing appropriate measures.
- Document application design, development, and maintenance processes.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL concepts.
- Experience in designing and developing Ab Initio graphs and plans.
- Knowledge of database concepts and SQL.
- Experience with version control systems such as Git.
- Good To Have Skills: Experience with data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Qualifications: 15 years full time education

Posted 2 months ago

Apply

8 - 13 years

11 - 17 Lacs

Gurugram

Work from Office

SUMMARY
Job Role: ETL Testing with PySpark Professionals
Location: Gurgaon
Experience: 8+ years
Must-Have: 6+ years of relevant experience in ETL Testing and PySpark

Responsibilities:
- 6-8+ years of experience in ETL Testing with Automation Testing.
- Proficiency in database testing using SQL.
- Experience with Databricks and familiarity with Databricks-related concepts.
- Verification of data source locations and formats, data counts, and validation of columns and data types.
- Testing data accuracy and completeness, and identification of key ETL mapping scenarios (a worked PySpark sketch follows this listing).
- Development and execution of test plans, test cases, and test scripts.
- Proficiency in writing complex SQL queries and validation of Enterprise Data Warehouse applications.
- Understanding of data models, ETL architecture, and Data Warehouse concepts.
- Experience working with Agile methodology.
- Exposure to PySpark is a plus.

Requirements:
- 6+ years of experience in ETL Testing and PySpark.
- Proficiency in SQL and database testing.
- Familiarity with Databricks and related concepts.
- Strong understanding of ETL architecture and Data Warehouse concepts.
- Experience with Agile methodology.
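As a concrete illustration of the source-to-target checks listed above, here is a minimal PySpark validation sketch; the paths, table names, and columns are assumptions for illustration, not details of this role.

    # Minimal source-to-target reconciliation sketch in PySpark.
    # Table names, paths, and columns are illustrative assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("etl-validation").getOrCreate()

    source = spark.read.parquet("s3://bucket/landing/orders/")  # hypothetical path
    target = spark.table("dw.fact_orders")                      # hypothetical table

    # 1. Row-count check: the load should neither drop nor duplicate rows.
    assert source.count() == target.count(), "row counts differ between source and target"

    # 2. Column and data-type check against the expected target schema.
    expected = {"order_id": "bigint", "amount": "decimal(18,2)", "order_date": "date"}
    actual = dict(target.dtypes)
    for col, dtype in expected.items():
        assert actual.get(col) == dtype, f"{col}: expected {dtype}, got {actual.get(col)}"

    # 3. Completeness check: key columns must not contain nulls.
    null_keys = target.filter(F.col("order_id").isNull()).count()
    assert null_keys == 0, f"{null_keys} rows have a null order_id"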

Posted 2 months ago

Apply

2 - 4 years

10 - 14 Lacs

Pune

Hybrid

So, what’s the role all about?
A Java fullstack software developer is responsible for both frontend and backend development using Java-based technologies. Here's an overview of what you can expect in this role.

How will you make an impact?
- Investigate, measure, and report on a client's risk of suspicious or fraudulent financial activity.
- Follow SOPs in line with anti-money laundering laws and carry out investigations.
- Identify areas for improving the alert investigation process.
- Collaborate with auditors and regulators to minimize money-laundering risks to the client's business.
- Record and report any suspicious transactions or activities in an efficient and timely manner.
- Work proactively on investigations within SLA and be a strong performer in the team.
- Be well versed with FCC investigator solutions, including Actimize (if possible).
- Work within service levels and KPIs, in line with regulatory best practice.
- Stay up to date with training conducted for the investigation team, including workshops, conferences, and any certification or refresher training as required.
- Review risk and complete risk assessments as required.
- Maintain and update your knowledge of anti-money laundering compliance rules, regulations, laws, and best practices.
- Take part in and lead anti-money laundering compliance training on identifying suspicious activity for other team members.
- Provide indirect/direct consulting to clients.
- Provide domain expertise support during the pre/post service sales process.

Have you got what it takes?
- Bachelor's/Master's degree in Computer Science, Electronic Engineering, or an equivalent field from a reputed institute.
- 2+ years of software development experience.
- At least 2 years of working experience in Core Java; proficient with Java algorithms and data structures.
- Experience with high-performance, highly available, and scalable systems.
- Strong experience with J2EE, the Spring Framework, IoC, and annotations.
- Experience with an object-relational mapping framework (e.g., Hibernate).
- Strong knowledge of OOAD and design patterns.
- Development experience building solutions that leverage SQL and NoSQL databases.
- Strong development experience creating RESTful Web APIs.
- Knowledge of big data and ETL concepts (or a BI tool such as Tableau) is an added advantage.
- Experience designing and developing scalable multi-tenant SaaS-based solutions.
- Experience with public cloud infrastructure and technologies such as AWS, Azure, or GCP.
- Development experience in Angular.
- Experience working in and driving Continuous Integration and Delivery practices using industry-standard tools such as Jenkins.
- Experience working in an Agile development environment and using work-item management tools like JIRA.
- Experience with version control tools such as Git and Perforce.
- Ability to work independently and collaboratively, with good communication skills.
- A culture of innovation, the ability to work under high pressure, and high attention to detail and accuracy.
- Able to resolve problems of moderate scope that require analysis based on a review of a variety of factors.

You will have an advantage if you also have:
- Experience in big data.

What's in it for you?
Join an ever-growing, market disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next Nicer!

Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7243
Reporting into: Tech Manager
Role Type: Individual Contributor

Posted 2 months ago

Apply

2 - 7 years

6 - 10 Lacs

Bengaluru

Work from Office

Hello Talented Techie!
We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make efficient use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective implementations.

We are looking for a Data Engineer (AWS, Confluent & SnapLogic):
- Data Integration: Integrate data from various Siemens organizations into our data factory, ensuring seamless data flow and real-time data fetching.
- Data Processing: Implement and manage large-scale data processing solutions using AWS Glue, ensuring efficient and reliable data transformation and loading.
- Data Storage: Store and manage data in a large-scale data lake, utilizing Iceberg tables in Snowflake for optimized data storage and retrieval.
- Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency.
- Data Products: Create and maintain data products that meet the needs of various stakeholders, providing actionable insights and supporting data-driven decision-making.
- Workflow Management: Use Apache Airflow to orchestrate and automate data workflows, ensuring timely and accurate data processing (a minimal Airflow sketch appears at the end of this listing).
- Real-time Data Streaming: Utilize Confluent Kafka for real-time data streaming, ensuring low-latency data integration and processing.
- ETL Processes: Design and implement ETL processes using SnapLogic, ensuring efficient data extraction, transformation, and loading.
- Monitoring and Logging: Use Splunk for monitoring and logging data processes, ensuring system reliability and performance.

You'd describe yourself as:
- Experience: 3+ years of relevant experience in data engineering, with a focus on AWS Glue, Iceberg tables, Confluent Kafka, SnapLogic, and Airflow.
- Technical Skills: Proficiency in AWS services, particularly AWS Glue; experience with Iceberg tables and Snowflake; knowledge of Confluent Kafka for real-time data streaming; familiarity with SnapLogic for ETL processes; experience with Apache Airflow for workflow management; understanding of Splunk for monitoring and logging.
- Programming Skills: Proficiency in Python, SQL, and other relevant programming languages.
- Data Modeling: Experience with data modeling and database design.
- Problem-Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve data-related issues.

Preferred Qualities:
- Attention to Detail: Meticulous attention to detail, ensuring data accuracy and quality.
- Communication Skills: Excellent communication skills, with the ability to collaborate effectively with cross-functional teams.
- Adaptability: Ability to adapt to changing technologies and work in a fast-paced environment.
- Team Player: Strong team player with a collaborative mindset.
- Continuous Learning: Eagerness to learn and stay updated with the latest trends and technologies in data engineering.

Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide.
We value your unique identity and perspective and are fully committed to providing equitable opportunities and building a workplace that reflects the diversity of society. Come bring your authentic self and create a better tomorrow with us. Find out more about Siemens careers at: www.siemens.com/careers
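For readers unfamiliar with how Airflow and Glue fit together in a stack like the one above, here is a minimal DAG sketch that triggers a Glue job once a day via boto3. The job name, region, and schedule are assumptions for illustration (Airflow 2.4+ syntax; the Amazon provider also ships a dedicated GlueJobOperator).

    # Minimal Airflow DAG that starts an AWS Glue job daily via boto3.
    # Job name and region are illustrative assumptions.
    from datetime import datetime

    import boto3
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_glue_job():
        glue = boto3.client("glue", region_name="eu-central-1")
        run = glue.start_job_run(JobName="transform-orders")  # hypothetical Glue job
        print("Started Glue run:", run["JobRunId"])

    with DAG(
        dag_id="daily_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # "schedule_interval" on Airflow < 2.4
        catchup=False,
    ) as dag:
        PythonOperator(task_id="run_glue_transform", python_callable=run_glue_job)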

Posted 2 months ago

Apply

5 - 10 years

5 - 6 Lacs

Pune

Work from Office

Diverse Lynx is looking for a Power BI Developer to join our dynamic team and embark on a rewarding career journey. The role is responsible for designing, developing, and implementing business intelligence solutions using Power BI, a data visualization and reporting tool from Microsoft.
- Connecting to and integrating data from various sources, including databases, spreadsheets, and cloud services.
- Designing and creating data models, dashboards, reports, and other data visualizations.
- Enhancing existing Power BI solutions to meet evolving business requirements.
- Collaborating with stakeholders to understand their data needs and requirements.
- Building and maintaining data pipelines and ETL processes to ensure data quality and accuracy.
- Developing and implementing security and access control measures to ensure the protection of sensitive data.
- Troubleshooting and resolving issues with Power BI solutions.
- Documenting and communicating solutions to stakeholders.
- Excellent communication, analytical, and problem-solving skills required.

Posted 2 months ago

Apply

10 - 15 years

12 - 16 Lacs

Pune

Work from Office

About The Role
The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging knowledge of market drivers and competition to effectively anticipate trends and opportunities. The leader must also demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged and high-impact direct, virtual, and cross-functional teams, take the lead in raising the performance bar, build capability, and bring out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes.

Associate Program Manager - Role and responsibilities:
- Represent eClerx in client pitches, external forums, and COE (Center of Excellence) activities to promote cloud engineering expertise.
- Lead research, assessments, and development of best practices to keep our cloud engineering solutions at the forefront of technology.
- Contribute to the growth of the cloud engineering practice through thought leadership, including the creation of white papers and articles.
- Lead and collaborate on multi-discipline assessments at client sites to identify new cloud-based opportunities.
- Provide technical leadership in the design and development of robust, scalable cloud architectures.
- Drive key cloud engineering projects, ensuring high performance, scalability, and adherence to best practices.
- Design and implement data architectures that address performance, scalability, and data latency requirements.
- Lead the development of cloud-based solutions, ensuring they are scalable, robust, and aligned with business needs.
- Anticipate and mitigate data bottlenecks, proposing strategies to enhance data processing efficiency.
- Provide mentorship and technical guidance to junior team members.

Technical and Functional skills:
- Bachelor's degree with 10+ years of experience in data management and cloud engineering.
- Proven experience in at least 2-3 large-scale cloud implementations within industries such as Retail, Manufacturing, or Technology.
- Expertise in Azure Cloud, Azure Data Lake, Databricks, Teradata, and ETL technologies.
- Strong problem-solving skills with a focus on performance optimization and data quality.
- Ability to collaborate effectively with analysts, subject matter experts, and external partners.

About Us
At eClerx, we serve some of the largest global companies, including 50 of the Fortune 500, as clients. Our clients call upon us to solve their most complex problems and deliver transformative insights. Across roles and levels, you get the opportunity to build expertise, challenge the status quo, think bolder, and help our clients seize value.

About the Team
eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services.
Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience. eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.

Posted 2 months ago

Apply

10 - 15 years

12 - 16 Lacs

Mumbai

Work from Office

About The Role
The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging knowledge of market drivers and competition to effectively anticipate trends and opportunities. The leader must also demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged and high-impact direct, virtual, and cross-functional teams, take the lead in raising the performance bar, build capability, and bring out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes.

Associate Program Manager - Role and responsibilities:
- Represent eClerx in client pitches, external forums, and COE (Center of Excellence) activities to promote cloud engineering expertise.
- Lead research, assessments, and development of best practices to keep our cloud engineering solutions at the forefront of technology.
- Contribute to the growth of the cloud engineering practice through thought leadership, including the creation of white papers and articles.
- Lead and collaborate on multi-discipline assessments at client sites to identify new cloud-based opportunities.
- Provide technical leadership in the design and development of robust, scalable cloud architectures.
- Drive key cloud engineering projects, ensuring high performance, scalability, and adherence to best practices.
- Design and implement data architectures that address performance, scalability, and data latency requirements.
- Lead the development of cloud-based solutions, ensuring they are scalable, robust, and aligned with business needs.
- Anticipate and mitigate data bottlenecks, proposing strategies to enhance data processing efficiency.
- Provide mentorship and technical guidance to junior team members.

Technical and Functional skills:
- Bachelor's degree with 10+ years of experience in data management and cloud engineering.
- Proven experience in at least 2-3 large-scale cloud implementations within industries such as Retail, Manufacturing, or Technology.
- Expertise in Azure Cloud, Azure Data Lake, Databricks, Teradata, and ETL technologies.
- Strong problem-solving skills with a focus on performance optimization and data quality.
- Ability to collaborate effectively with analysts, subject matter experts, and external partners.

About Us
At eClerx, we serve some of the largest global companies, including 50 of the Fortune 500, as clients. Our clients call upon us to solve their most complex problems and deliver transformative insights. Across roles and levels, you get the opportunity to build expertise, challenge the status quo, think bolder, and help our clients seize value.

About the Team
eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services.
Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience. eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.

Posted 2 months ago

Apply

2 - 5 years

3 - 7 Lacs

Gurugram

Work from Office

Role: Data Engineer

Skills:
- Data Modeling: Design and implement efficient data models, ensuring data accuracy and optimal performance.
- ETL Development: Develop, maintain, and optimize ETL processes to extract, transform, and load data from various sources into our data warehouse.
- SQL Expertise: Write complex SQL queries to extract, manipulate, and analyze data as needed.
- Python Development: Develop and maintain Python scripts and applications to support data processing and automation.
- AWS Expertise: Leverage your deep knowledge of AWS services, such as S3, Redshift, Glue, EMR, and Athena, to build and maintain data pipelines and infrastructure (a small Athena example follows this list).
- Infrastructure as Code (IaC): Experience with tools like Terraform or CloudFormation to automate the provisioning and management of AWS resources is a plus.
- Big Data Processing: Knowledge of PySpark for big data processing and analysis is desirable.
- Source Code Management: Utilize Git and GitHub for version control and collaboration on data engineering projects.
- Performance Optimization: Identify and implement optimizations for data processing pipelines to enhance efficiency and reduce costs.
- Data Quality: Implement data quality checks and validation procedures to maintain data integrity.
- Collaboration: Work closely with data scientists, analysts, and other teams to understand data requirements and deliver high-quality data solutions.
- Documentation: Maintain comprehensive documentation for all data engineering processes and projects.
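As a small, hedged example of the AWS querying skills this role lists, the sketch below runs an Athena query from Python with boto3 and prints the results; the database, table, and S3 output location are illustrative assumptions.

    # Run an Athena query via boto3 and print the result rows.
    # Database, table, and output bucket are illustrative assumptions.
    import time
    import boto3

    athena = boto3.client("athena", region_name="ap-south-1")

    start = athena.start_query_execution(
        QueryString=(
            "SELECT event_date, count(*) AS events "
            "FROM analytics.web_events GROUP BY event_date"
        ),
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )
    qid = start["QueryExecutionId"]

    # Poll until the query finishes; production code would add a timeout.
    while True:
        status = athena.get_query_execution(QueryExecutionId=qid)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state == "SUCCEEDED":
        rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
        for row in rows[1:]:  # first row is the header
            print([col.get("VarCharValue") for col in row["Data"]])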

Posted 2 months ago

Apply
