Jobs
Interviews

1055 ETL Processes Jobs - Page 26

Set up a Job Alert
JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

7.0 - 10.0 years

25 - 30 Lacs

Bengaluru, Karnataka, India

On-site

Description
We are seeking a highly skilled Oracle MDM - PDH professional to join our team in India. The ideal candidate will have extensive experience in implementing and managing Oracle Master Data Management solutions, specifically in Product Data Hub, to ensure data integrity and governance across the organization.

Responsibilities
- Design, implement, and maintain Oracle Master Data Management (MDM) solutions.
- Collaborate with cross-functional teams to gather and analyze data requirements.
- Develop and execute data quality and governance strategies.
- Support data integration and migration processes.
- Troubleshoot and resolve data-related issues in a timely manner.
- Provide training and support to end users on MDM applications.

Skills and Qualifications
- 7-10 years of experience in Oracle MDM, preferably with Product Data Hub (PDH) expertise.
- Strong understanding of master data management principles and practices.
- Proficiency in SQL and PL/SQL for data manipulation and analysis.
- Experience with data modeling and data governance frameworks.
- Knowledge of data integration tools and techniques.
- Familiarity with Oracle Cloud applications is a plus.
- Excellent analytical and problem-solving skills.
- Strong communication and teamwork abilities.

Posted 1 month ago

Apply

8.0 - 10.0 years

4 - 14 Lacs

Pune, Maharashtra, India

On-site

Description
We are seeking an experienced Oracle PostgreSQL Developer to join our dynamic team in India. The ideal candidate will have extensive experience in developing and managing PostgreSQL databases, ensuring optimal performance and reliability to support our business applications.

Responsibilities
- Design, develop, and maintain PostgreSQL database systems.
- Optimize database performance through indexing, query optimization, and schema design.
- Work with application developers to integrate PostgreSQL with various applications.
- Implement backup and recovery strategies for PostgreSQL databases.
- Monitor database health and performance, ensuring high availability and reliability.
- Write complex SQL queries and stored procedures to support business requirements.
- Perform database upgrades and migrations with minimal downtime.
- Ensure data security and compliance with data protection regulations.

Skills and Qualifications
- 8-10 years of experience in PostgreSQL database development and administration.
- Strong knowledge of SQL and PL/pgSQL.
- Experience with database performance tuning and optimization techniques.
- Proficient in database backup and recovery strategies.
- Hands-on experience with data modeling and schema design.
- Familiarity with database migration and upgrade processes.
- Understanding of data security best practices.
- Experience with Linux/Unix environments and shell scripting.
- Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
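As a minimal sketch of the indexing and query-tuning workflow mentioned above (not part of the posting's own material): a Python script using psycopg2 that compares a query plan before and after adding an index. The connection string, the orders table, and the customer_id column are assumptions for illustration.

```python
# Hypothetical sketch: inspect and tune a slow PostgreSQL query with psycopg2.
# Table/column names (orders, customer_id) and the DSN are illustrative only.
import psycopg2

DSN = "dbname=appdb user=report_user host=localhost"  # assumed connection string

def explain(cur, sql, params=()):
    """Print the planner/executor output for a query (EXPLAIN ANALYZE)."""
    cur.execute("EXPLAIN ANALYZE " + sql, params)
    for (line,) in cur.fetchall():
        print(line)

with psycopg2.connect(DSN) as conn:
    with conn.cursor() as cur:
        slow_sql = "SELECT count(*) FROM orders WHERE customer_id = %s"

        # Baseline plan: likely a sequential scan if no suitable index exists.
        explain(cur, slow_sql, (42,))

        # Candidate fix: a btree index on the filter column.
        cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_customer_id "
                    "ON orders (customer_id)")
        conn.commit()

        # Re-check: the plan should now use an index or bitmap scan.
        explain(cur, slow_sql, (42,))
```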

Posted 1 month ago

Apply

4.0 - 8.0 years

25 - 27 Lacs

Mumbai, Hyderabad, Pune

Work from Office

As an ODI developer you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle Cloud Infrastructure; another may involve building ETL solutions both on-premises and on Oracle Cloud. The key responsibilities may involve some or all of the areas listed below:

Client engagement
- Conduct workshops with clients, understand business requirements, and identify business problems to solve with integrations.
- Lead and build proofs of concept to showcase the value of ODI versus other platforms.
- Socialize solution designs and enable knowledge transfer; drive train-the-trainer sessions to drive adoption of ODI.
- Partner with clients to drive outcomes and deliver value.

Collaboration
- Collaborate with cross-functional teams to understand source applications and how they can be integrated.
- Analyze data sets to understand their functional and business context.
- Create the data warehousing data model and integration design.
- Understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR), and Project to Complete (PTC).
- Communicate development status and risks to key stakeholders.

Delivery
- Lead the team to design, build, test, and deploy; support client needs by delivering ODI jobs and frameworks.
- Merge, customize, and deploy the ODI data model as per client business requirements.
- Deliver large/medium DWH programs; demonstrate expert core consulting skills, an advanced level of ODI, SQL, and PL/SQL knowledge, and industry expertise to support delivery to clients.
- Focus on designing, building, and documenting reusable code artifacts.
- Track, report, and optimize ODI job performance to meet client SLAs.
- Design and architect ODI projects, including upgrades/migrations to the cloud.
- Design and implement security in ODI.
- Identify risks and suggest mitigation plans.
- Lead the team and mentor junior practitioners.
- Produce high-quality code resulting from knowledge of the tool, code peer review, and automated unit test scripts.
- Perform system analysis, follow technical designs, and work on development activities.
- Participate in design meetings, daily standups, and backlog grooming; lead respective tracks in Scrum team meetings, including all Agile and Scrum related activities.
- Review and evaluate designs and project activities for compliance with systems design and development guidelines and standards; provide tangible feedback to improve product quality and mitigate failure risk.
- Develop the environment strategy, build the environment, and execute migration plans; validate that the environment meets all security and compliance controls.
- Lead testing efforts during SIT and UAT by coordinating with functional teams and all stakeholders.
- Contribute to sales pursuits by helping the pursuit team understand the client request and propose robust solutions.

Ideally, you should also have
- Expertise in database development (SQL/PL/SQL) for PL/SQL-based applications.
- Experience in designing and developing Oracle objects such as tables, views, indexes, partitions, stored procedures and functions in PL/SQL, packages, materialized views, and analytical functions.
- Working knowledge of Git or a similar source code control system.
- Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking, and migrating complex datasets in Oracle.
- Experience in SQL tuning and optimization using explain plans and SQL trace files; partitioning and indexing strategy for optimal performance.
- Good verbal and written communication in English; strong interpersonal, analytical, and problem-solving abilities.
- Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents.
- 6+ years of ETL lead/developer experience and a minimum of 3-4 years of experience in Oracle Data Integrator (ODI); expertise in the Oracle ODI toolset and Oracle PL/SQL.
- A minimum of 2-3 end-to-end DWH implementations.
- Experience developing ETL processes: ETL control tables, error logging, auditing, data quality, etc.; ability to implement reusability, parameterization, workflow design, etc.
- Knowledge of the ODI master and work repositories; knowledge of data modelling and ETL design.
- Ability to design and develop complex mappings, process flows, and ETL scripts; well versed and hands-on in using and customizing Knowledge Modules (KMs).
- Setting up topology, building objects in Designer, monitoring Operator, different types of KMs, Agents, etc.; packaging components and database operations such as aggregate, pivot, union, etc.
- Using ODI mappings, error handling, automation using ODI, load plans, and migration of objects.
- Experience in performance tuning of mappings; ability to design ETL unit test cases and debug ETL mappings.
- Expertise in developing load plans and scheduling jobs; integrating ODI with multiple sources/targets.
- Experience in data migration using SQL*Loader and import/export.

Consulting Requirements
- 6-10 years of relevant consulting, industry, or technology experience.
- Proven experience assessing clients' workloads and technology landscape for cloud suitability.
- Experience defining new architectures and ability to drive a project from an architecture standpoint.
- Ability to quickly establish credibility and trustworthiness with key stakeholders in the client organization.
- Strong problem-solving and troubleshooting skills; strong communicator.
- Willingness to travel in case of project requirements.

Preferred
- Experience in Oracle BI Apps.
- Exposure to one or more of the following: Python, R, or UNIX shell scripting.

Location: Mumbai/ Hyderabad/ Bangalore/ Delhi/ Pune/ Kolkata/ Chennai
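As a minimal sketch of the explain-plan tuning workflow listed above, the snippet below drives Oracle's EXPLAIN PLAN and DBMS_XPLAN from Python with cx_Oracle. The credentials and the sample tables (src_orders, dim_customer) are assumptions, not part of the posting.

```python
# Hypothetical sketch of the "explain plan" tuning step, driven from Python with
# cx_Oracle. Connection details and the sample join are illustrative only.
import cx_Oracle

conn = cx_Oracle.connect("etl_user", "secret", "dbhost/ORCLPDB1")  # assumed credentials
cur = conn.cursor()

sql = """
    SELECT c.customer_name, SUM(o.amount)
    FROM src_orders o JOIN dim_customer c ON c.customer_id = o.customer_id
    GROUP BY c.customer_name
"""

# Load the execution plan into PLAN_TABLE without running the statement...
cur.execute("EXPLAIN PLAN FOR " + sql)

# ...then render it with DBMS_XPLAN to spot full scans, bad join orders, etc.
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur.fetchall():
    print(line)

cur.close()
conn.close()
```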

Posted 1 month ago

Apply

3.0 - 7.0 years

4 - 10 Lacs

Chennai, Tamil Nadu, India

On-site

Key Responsibilities:
- Design and implement data migration strategies using Syniti tools (DSP, ADM, Collect, Construct, etc.).
- Analyze legacy data and map it to target ERP systems.
- Develop and execute ETL processes for data extraction, transformation, and loading.
- Create and maintain data load programs and migration templates.
- Perform data profiling, validation, and reconciliation to ensure data quality.
- Collaborate with business stakeholders and technical teams to resolve data issues.
- Document migration processes and maintain compliance with data governance standards.
- Mentor junior consultants and contribute to knowledge sharing.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 4-7 years of experience in data migration or ETL projects.
- Strong hands-on experience with Syniti Data Migration tools.
- Proficiency in SQL, stored procedures, and database management.
- Experience with SAP or other ERP systems is a plus.
- Strong analytical, problem-solving, and communication skills.
- Ability to work independently and in global teams.

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

We are looking for a skilled and experienced SAP SAC (SAP Analytics Cloud) Senior Consultant to join our team. As a Senior Consultant, you will utilize your expertise in SAP Analytics Cloud to facilitate data-driven decision-making for our clients. Your role will involve implementing analytics solutions that convert intricate data into actionable insights to enhance business performance. Your primary tasks will revolve around designing, developing, and deploying SAP SAC analytics applications, crafting dashboards and reports, and collaborating with clients to comprehend their analytical requirements. You will play a crucial role in steering projects from inception to completion, ensuring top-notch deliverables and seamless user experiences. Key Responsibilities: - Lead the implementation of SAP SAC solutions, encompassing planning, visualization, and predictive analytics - Work closely with clients to collect requirements and translate them into technical specifications for analytics solutions - Create interactive dashboards, stories, and data visualizations that align with client needs - Optimize data models and data preparation processes to guarantee precise and punctual reporting - Offer training and assistance to clients on the efficient utilization of SAP Analytics Cloud - Stay abreast of SAP SAC developments and best practices, integrating them into client solutions - Mentor junior consultants on analytics methodologies and SAP SAC platform initiatives Required Qualifications: - Bachelor's degree in Computer Science, Data Analytics, Business Intelligence, or a related field - Minimum of 5+ years of experience with SAP Analytics Cloud or related analytics tools, focusing on dashboard development - Sound grasp of data visualization principles and best practices - Proficiency in data integration tools and ETL processes - Exceptional analytical and problem-solving abilities, with the capacity to interpret complex data sets - Strong communication and interpersonal skills to effectively engage with clients and stakeholders Join us at Talworx, an emerging recruitment consulting startup, as we are hiring for our client - a leading big 4 company. Our client is a British multinational professional services network headquartered in London, England. Renowned as the largest professional services network globally by revenue and number of professionals, it is recognized as one of the Big Four accounting firms.,

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

The ideal candidate for this role will be responsible for Business Process Analysis, which involves designing and maintaining analytics frameworks for business processes and financial workflows. You will develop dashboards and reports to provide real-time insights into key business metrics. Analyzing user interaction patterns with AI features to enhance adoption and effectiveness will be a crucial part of your role. Additionally, you will create and manage ETL pipelines to ensure data quality and accessibility, as well as generate actionable insights from complex financial datasets to inform product strategy. In terms of AI Platform Analysis, you will be tasked with developing and implementing frameworks to evaluate the performance of LLM-based systems and AI agents. This includes analyzing AI model outputs, response quality, and automation effectiveness. You will also track and report on key AI performance metrics such as accuracy, latency, and automation rates. Collaboration with ML engineers to identify areas for model improvements based on data insights and creating dashboards for monitoring AI system performance and reliability will be part of your responsibilities. A/B testing of different AI configurations and prompt strategies will also be conducted. Regarding the Strategic Impact aspect of the role, you will provide data-driven recommendations for enhancing both AI capabilities and business processes. Partnering with product teams to define and track success metrics for AI features and collaborating with customer success teams to analyze user feedback and AI system effectiveness will be essential. Building and maintaining documentation for AI performance metrics and analytics processes is another key responsibility. **Required Qualifications:** - 5+ years of experience in data analysis, with recent exposure to AI/ML systems - Proficiency in SQL and Python for data manipulation and analysis - Experience analyzing AI/ML system performance and metrics - Expertise in business intelligence tools (e.g., Tableau, Power BI, Looker) - Familiarity with large-scale financial datasets and ETL processes - Strong statistical analysis skills and experience with A/B testing - Experience with cloud-based data warehouses, preferably Snowflake - Bachelor's degree in Statistics, Mathematics, Computer Science, or a related field **Preferred Qualifications:** - Experience analyzing LLM-based systems and generative AI applications - Knowledge of NLP metrics and evaluation frameworks - Experience with financial systems and processes (AP, AR, Vendor Management) - Familiarity with prompt engineering and LLM performance optimization - Experience with real-time analytics and streaming data - Understanding of data privacy and compliance requirements in finance - Master's degree in a related field **Technical Skills Required:** **Data Analysis & Visualization:** Python (Pandas, NumPy, Matplotlib, Seaborn, Plotly, Dash), SQL, Tableau/Power BI/QuickSight, Excel **AI/ML Analytics:** LLM evaluation metrics, AI monitoring frameworks, Prompt effectiveness analysis **Data Infrastructure:** Snowflake or similar data warehouses, AWS/Azure data services, ETL tools and frameworks, Data modeling **Statistics & Testing:** Statistical analysis, A/B testing, Hypothesis testing, Experimental design This role requires a candidate with expertise in data analysis, AI/ML systems, and a strong technical skill set in various tools and frameworks related to data analysis and AI analytics.,
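As a minimal sketch of the A/B-testing step described above, the following compares the automation rate of two AI configurations with a chi-square test. The counts and configuration names are invented for illustration; in practice the inputs would come from the analytics warehouse via SQL.

```python
# Hypothetical A/B comparison of two AI prompt configurations on automation
# (auto-resolution) rate. All numbers are made up.
from scipy.stats import chi2_contingency

# configuration A vs configuration B: [auto-resolved, needed human review]
results = [
    [412, 188],   # prompt/config A (600 invoices)
    [455, 145],   # prompt/config B (600 invoices)
]

chi2, p_value, dof, expected = chi2_contingency(results)

rate_a = results[0][0] / sum(results[0])
rate_b = results[1][0] / sum(results[1])
print(f"automation rate A = {rate_a:.1%}, B = {rate_b:.1%}")
print(f"chi-square p-value = {p_value:.4f}")

# A small p-value (e.g. < 0.05) suggests the difference in automation rate is
# unlikely to be chance; otherwise keep collecting data before switching configs.
```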

Posted 1 month ago

Apply

2.0 - 5.0 years

4 - 6 Lacs

Navi Mumbai

Work from Office

Job Title: Power BI Developer
Location: Airoli
Department: IT
Experience Required: 2-5 years
Employment Type: Full-Time

Required Skills:
1. Proven experience as a Power BI Developer or in a similar role.
2. Strong proficiency in Power BI Desktop, Power BI Service, DAX, and Power Query (M).
3. Good knowledge of SQL and experience with databases like SQL Server, Azure SQL, or Oracle.
4. Experience in data modeling, ETL processes, and creating dataflows.
5. Familiarity with Azure Data Factory, Synapse, or other Microsoft Azure services.
6. Knowledge of Python or R for data analysis (optional).
7. Understanding of data warehouse concepts and OLAP.
8. Strong analytical thinking and problem-solving skills.
9. Excellent communication and stakeholder management skills.

Posted 1 month ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Delhi, India

On-site

Job Description Make an impact with NTT DATA Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion it's a place where you can grow, belong and thrive. Your day at NTT DATA The Managed Services Cross Technology Engineer (L3) is responsible for providing a service to clients by proactively identifying and resolving technical incidents and problems. Through pre-emptive service incident and resolution activities, as well as product reviews, operational improvements, operational practices, and quality assurance this role will maintain a high level of service to clients. Their primary objective is to ensure zero missed service level agreement (SLA) conditions. The Managed Services Cross Technology Engineer (L3) is responsible for managing tickets of high complexity, conducts advanced and complicated tasks, and provides resolution to a diverse range of complex problems. This position uses considerable judgment and independent analysis within defined policies and practices. This role applies analytical thinking and deep technical expertise in achieving client outcomes, while coaching and mentoring junior team members across functions. This role focusses across two or more technology domains - Cloud, Security, Networking, Applications and / or Collaboration etc. This role may also contribute to / support on project work as and when required. What you'll be doing Key Roles and Responsibilities: Ensures that assigned infrastructure at the client site is configured, installed, tested, and operational Performs necessary checks, apply monitoring tools and respond to alerts Identifies problems and errors prior to or when it occurs and logs all such incidents in a timely manner with the required level of detail Assists in analysing, assigning, and escalating support calls Investigates third line support calls assigned and identify the root cause of incidents and problems Reports and escalates issues to 3rd party vendors if necessary Provides continuous feedback to clients and affected parties and update all systems and/or portals as prescribed by NTT Proactively identifies opportunities for work optimization including opportunities for automation of work Coaches L2 teams for advance technical troubleshooting and behavioural skills May manage and implement projects within technology domain, delivering effectively and promptly per client agreed upon requirements and timelines May work on implementing and delivering Disaster Recovery functions and tests Knowledge, Skills and Attributes: Ability to communicate and work across different cultures and social groups Ability to plan activities and projects well in advance, and takes into account possible changing circumstances Ability to maintain a positive outlook at work Ability to work well in a pressurized environment Ability to work hard and put in longer hours when it is necessary Ability to apply active listening techniques such as paraphrasing the message to confirm understanding, probing for further relevant information, and refraining from interrupting Ability to adapt to changing circumstances Ability to place clients at the forefront of all interactions, understanding their requirements, and creating a positive client experience throughout the total client journey Academic Qualifications and Certifications: Bachelor's degree or equivalent qualification in IT/Computing 
(or demonstrated equivalent work experience). Certifications relevant to the services provided carry additional weightage on a candidate's qualification for the role. Relevant certifications include:
- CCNP or equivalent certification; CCNP Security, PCNSE, or a firewall vendor certification is good to have, along with advanced technical certifications such as CCIE or CISSP
- VMware Certified Professional: Data Centre Virtualization
- VMware Certified Specialist: Cloud Provider
- VMware Site Recovery Manager: Install, Configure, Manage
- Microsoft Certified: Azure Architect Expert
- AWS Certified: Solutions Architect Associate
- Veeam Certified Engineer (VMCE)
- Rubrik Certified Systems Administrator
- Zerto, Pure, VxRail
- Google Cloud Platform (GCP)
- Oracle Cloud Infrastructure (OCI)
- SAP Certified Technology Associate - OS DB Migration for SAP NetWeaver 7.4
- SAP Technology Consultant
- SAP Certified Technology Associate - SAP HANA 2.0
- Oracle Cloud Infrastructure Architect Professional
- IBM Certified System Administrator - WebSphere Application Server Network

Required Experience:
- Seasoned Managed Services experience handling complex cross-technology infrastructure
- Seasoned experience in an Engineering function within a medium to large ICT organisation
- Seasoned working knowledge of ITIL processes
- Seasoned experience working with vendors and/or 3rd parties

Workplace type: On-site Working

About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.

Equal Opportunity Employer
NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.

Posted 1 month ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Mumbai, Maharashtra, India

On-site

Job Description Make an impact with NTT DATA Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion it's a place where you can grow, belong and thrive. Your day at NTT DATA The Application Managed Services Engineer (L3) is a seasoned engineering role, responsible for providing a managed service to clients by proactively identifying and resolving Packaged Application technologies such as ERP, Middleware and other business critical software incidents and problems. Through pre-emptive service incident and resolution activities, as well as product reviews, operational improvements, operational practices, and quality assurance this role maintains a high level of service to clients. The primary objective of this role is to ensure zero missed service level agreement (SLA) conditions and is responsible for managing tickets of high complexity, conducts advanced and complicated tasks, and provides resolution to a diverse range of complex problems. This position uses considerable judgment and independent analysis within defined policies and practices and applies analytical thinking and deep technical expertise in achieving client outcomes, while coaching and mentoring junior team members across functions. The Application Managed Services Engineer (L3) may also contribute to / support on project work as and when required. What you'll be doing Key Responsibilities: Ensures that assigned Packaged Application technologies such as ERP, Middleware and other business critical software in the client's environment site is configured, installed, tested, and operational. Performs necessary checks, apply monitoring tools and respond to alerts. Identifies problems and errors prior to or when it occurs and log all such incidents in a timely manner with the required level of detail. Assists in analyzing, assigning, and escalating support calls. Investigates third line support calls assigned and identify the root cause of incidents and problems. Reports and escalates issues to 3rd party vendors if necessary. Provides onsite technical support to clients and provide field engineering services to clients. Conducts a monthly random review of incidents and service requests, analyze and recommend improvement in quality. Provides continuous feedback to clients and affected parties and update all systems and/or portals as prescribed by the company. Proactively identifies opportunities for work optimization including opportunities for automation of work. May manage and implement projects within technology domain, delivering effectively and promptly per client agreed upon requirements and timelines. May work on implementing and delivering Disaster Recovery functions and tests. Performs any related task as required. Knowledge and Attributes: Ability to communicate and work across different cultures and social groups. Ability to plan activities and projects well in advance, and takes into account possible changing circumstances. Ability to maintain a positive outlook at work. Ability to work well in a pressurized environment. Ability to work hard and put in longer hours when it is necessary. Ability to apply active listening techniques such as paraphrasing the message to confirm understanding, probing for further relevant information, and refraining from interrupting. Ability to adapt to changing circumstances. 
Ability to place clients at the forefront of all interactions, understanding their requirements, and creating a positive client experience throughout the total client journey. Academic Qualifications and Certifications: Bachelor's degree or equivalent qualification in Information Technology/Computing (or demonstrated equivalent work experience). Certifications relevant to the services provided (certifications carry additional weightage on a candidate's qualification for the role). Relevant certifications such as (but not limited to) - SAP Certified Technology Associate - OS DB Migration for SAP NetWeaver 7.4. SAP Technology Consultant. SAP Certified Technology Associate - SAP HANA 2.0. Oracle Cloud Infrastructure Architect Professional. IBM Certified System Administrator - WebSphere Application Server Network. Required Experience: Seasoned years of work experience. Seasoned experience required in Engineering function within a medium to large ICT organization. Seasoned experience of Managed Services. Excellent working knowledge of ITIL processes. Excellent experience working with vendors and/or 3rd parties. Seasoned experience managing Packaged Application technologies such as ERP, Middleware and other business critical software. Workplace type: Hybrid Working About NTT DATA NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo. Equal Opportunity Employer NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.

Posted 1 month ago

Apply

3.0 - 6.0 years

3 - 6 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities:
- Develop and manage real-time data streaming using Apache Kafka
- Build and automate data applications with Python and Java
- Optimize and monitor data infrastructure for performance and availability
- Collaborate with cross-functional teams to improve data solutions
- Design and optimize SQL queries, ETL pipelines, and data models
- Implement and maintain data storage solutions (RDBMS, data lakes, cloud warehouses)
- Document technical processes and workflows

Skills and Qualifications:
- 3+ years in data engineering or related field
- Proficient in Python, Java, SQL, and ETL processes
- Hands-on with Apache Kafka and pipeline automation
- Strong in database design and query optimization
- Experience with cloud platforms (AWS, Azure, GCP) is a plus
- Good problem-solving and collaboration skills
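A minimal sketch of the Kafka streaming pattern in the responsibilities above, using the kafka-python client (one of several client options). The broker address, topic name, and event schema are assumptions for illustration.

```python
# Minimal producer/consumer sketch with kafka-python. Broker, topic, and the
# event payload are assumed values, not from the posting.
import json
from kafka import KafkaProducer, KafkaConsumer

BROKER = "localhost:9092"        # assumed broker
TOPIC = "orders.events"          # assumed topic

# Producer side: publish JSON-encoded events.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"order_id": 1001, "status": "CREATED", "amount": 250.0})
producer.flush()

# Consumer side: read events and hand them to a downstream load step.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    group_id="etl-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:   # blocks and polls indefinitely
    event = message.value
    # Placeholder for the transform/load step (e.g. upsert into a warehouse table).
    print(f"offset={message.offset} order_id={event['order_id']} status={event['status']}")
```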

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

The Salesforce Team Lead will be responsible for leading a team of Salesforce developers and administrators, overseeing the design, development, and deployment of Salesforce solutions. You will need to demonstrate strong leadership skills, technical expertise in Salesforce, and the ability to collaborate with cross-functional teams to deliver high-quality CRM solutions. Your responsibilities will include leading and managing the Salesforce development team, providing guidance, mentorship, and support. You will oversee the design, development, testing, and deployment of Salesforce solutions to ensure they meet business requirements and are delivered on time. Collaborating with stakeholders to gather requirements, design solutions, and develop project plans will be a key part of your role. Additionally, you will need to ensure the quality of Salesforce solutions through code reviews, testing, and adherence to best practices, as well as manage the integration of Salesforce with other systems and applications. Monitoring and maintaining the health of the Salesforce platform, including performance optimization and troubleshooting, will also be part of your responsibilities. It will be essential to stay up-to-date with Salesforce updates, releases, and best practices to ensure the team is leveraging the latest features and capabilities. Providing technical leadership and expertise in Salesforce development, including Apex, Visualforce, Lightning Components, and Salesforce APIs, will be crucial. Driving continuous improvement initiatives to enhance team productivity and the quality of deliverables will also be expected of you. To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as a Salesforce Team Lead, Salesforce Developer, or similar role is required. Strong proficiency in Salesforce development, including Apex, Visualforce, Lightning Components, and Salesforce APIs, is essential. You should also have experience with Salesforce administration, including configuration, customization, and user management, as well as familiarity with Salesforce integration tools and techniques. Excellent leadership, communication, and interpersonal skills are necessary, along with strong problem-solving skills and the ability to work independently and as part of a team. Salesforce certifications (e.g., Salesforce Certified Administrator, Salesforce Certified Platform Developer) are highly desirable. In terms of skills, experience with Agile/Scrum methodologies, knowledge of Salesforce Sales Cloud, Service Cloud, and other Salesforce products, understanding of data migration, data integration, and ETL processes, familiarity with DevOps practices and tools for Salesforce development, and experience with third-party applications and AppExchange products will be beneficial for this role.,

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Data Visualization Engineer position at Zoetis India Capability Center (ZICC) in Hyderabad offers a unique opportunity to be part of a team that drives transformative advancements in animal healthcare. As a key member of the pharmaceutical R&D team, you will play a crucial role in creating insightful and interactive visualizations to support decision-making in drug discovery, development, and clinical research. Your responsibilities will include designing and developing a variety of visualizations, from interactive dashboards to static visual representations, to summarize key insights from high-throughput screening and clinical trial data. Collaborating closely with cross-functional teams, you will translate complex scientific data into clear visual narratives tailored to technical and non-technical audiences. In this role, you will also be responsible for maintaining and optimizing visualization tools, ensuring alignment with pharmaceutical R&D standards and compliance requirements. Staying updated on the latest trends in visualization technology, you will apply advanced techniques like 3D molecular visualization and predictive modeling visuals to enhance data representation. Working with various stakeholders such as data scientists, bioinformaticians, and clinical researchers, you will integrate, clean, and structure datasets for visualization purposes. Your role will also involve collaborating with Zoetis Tech & Digital teams to ensure seamless integration of IT solutions and alignment with organizational objectives. To excel in this position, you should have a Bachelor's or Master's degree in Computer Science, Data Science, Bioinformatics, or a related field. Experience in the pharmaceutical or biotech sectors will be a strong advantage. Proficiency in visualization tools such as Tableau, Power BI, and programming languages like Python, R, or JavaScript is essential. Additionally, familiarity with data handling tools, omics and network visualization platforms, and dashboarding tools will be beneficial. Soft skills such as strong storytelling ability, effective communication, collaboration with interdisciplinary teams, and analytical thinking are crucial for success in this role. Travel requirements for this full-time position are minimal, ranging from 0-10%. Join us at Zoetis and be part of our journey to pioneer innovation and drive the future of animal healthcare through impactful data visualization.,
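A small illustrative sketch of the Python-based interactive visualization work described above, using Plotly Express on invented screening data; all column names and values are hypothetical.

```python
# Illustrative only: an interactive scatter plot of made-up screening results.
import pandas as pd
import plotly.express as px

df = pd.DataFrame({
    "compound_id": ["CMP-001", "CMP-002", "CMP-003", "CMP-004"],
    "potency_uM": [0.8, 12.5, 3.2, 0.2],
    "selectivity": [45, 3, 18, 60],
    "assay": ["kinase", "kinase", "GPCR", "GPCR"],
})

fig = px.scatter(
    df,
    x="potency_uM",
    y="selectivity",
    color="assay",
    hover_data=["compound_id"],
    log_x=True,
    title="Screening hits: potency vs selectivity (illustrative data)",
)
fig.write_html("screening_dashboard.html")  # shareable interactive output
```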

Posted 1 month ago

Apply

3.0 - 6.0 years

3 - 15 Lacs

Hyderabad, Telangana, India

On-site

Catalogue Technical Specialist/Analyst
- Support the technical integration of Alation with other enterprise systems that could contain critical data lineage or data quality information.
- Provide daily operational support of the integration catalog, including Active Directory role management, maintaining integrations, and capability enhancements.
- Collaborate with integration owners to identify, define, and capture integration and other key data inside the integration catalog.
- Optimize integration descriptions, keywords, and categories for search and discovery.
- Resolve issues related to the integration catalogue in a timely manner.
- Maintain the data catalog, ensuring accurate and up-to-date metadata for all data assets.
- Establish and enforce data quality standards and guidelines across the organization.
- Conduct regular data quality assessments and audits to identify and resolve data issues.
- Act as a point of contact for data catalog-related inquiries and provide timely resolutions.
- Generate reports and dashboards to provide insights into data catalog usage, data quality, and metadata completeness.
- Monitor and analyze data quality metrics to identify and address issues, anomalies, and discrepancies.
- Analyze metadata to identify trends, patterns, and areas for improvement.
- Validate and verify integration information to ensure accuracy, completeness, and reliability.
- Maintain documentation related to the catalog and lineage/data quality tools, including user guides, best practices, and standard operating procedures.

Experience:
- Self-directed individual with 3+ years of experience working in integration, data analytics, or related roles, ideally in a high-growth environment.
- Experience in integration design and development on iPaaS platforms.
- Experience with a DevOps and CI/CD approach for integration deployment.
- Hands-on experience with the design, configuration, or roll-out of catalog tools such as Alation or similar.
- Experience integrating with the ServiceNow platform to create CIs for driving change management and incident management.
- Proven experience in implementing and managing data lineage, catalog, or other solutions in complex enterprise environments.
- Experience working with databases, business intelligence (BI) tools, and ETL (Extract, Transform, Load) tools.
- Strong understanding of data catalogs and their respective capabilities, including data dictionaries, business glossaries, business lineage, technical lineage, and data management workflows.
- Understanding of multiple system integrations/flow of data, and data schema changes.

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Engineer at Nonrel, you will play a crucial role in designing, building, and maintaining scalable data pipelines. Your primary responsibilities will include data modeling, executing Extract Transform Load (ETL) processes, and creating robust data warehousing solutions. By leveraging your strong skills in Data Engineering and Data Modeling, you will ensure the integrity of data throughout its lifecycle while supporting business decisions through data analytics to drive meaningful insights. Collaboration with cross-functional teams will be a key aspect of your daily routine, as you work closely to understand data requirements and optimize data workflows. Your proficiency in developing and managing Data Warehousing solutions will be essential in transforming businesses into agile, technology-driven powerhouses. To excel in this role, you should possess a Bachelor's degree in Computer Science, Information Technology, or a related field. Experience in the IT or software industry would be advantageous. Your problem-solving skills, analytical mindset, and ability to work effectively in a team environment will be critical in driving innovation and efficiency through cutting-edge software solutions. If you are passionate about leveraging technology to solve complex challenges, streamline operations, and enhance customer experiences, then this full-time on-site position in Canada, USA at Nonrel is the perfect opportunity for you to shape the future of businesses with innovative digital strategies.,

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As part of the Reporting and Data Analytics (R&DA) team at EY, you will be responsible for delivering key management and financial reporting across the firm. Your role will involve working within a team of developers to support and build reports using Power BI and SSRS. Reports will be created either from pre-prepared datasets or by generating new datasets through SQL and Power Query processes. You will have the opportunity to collaborate closely with the broader R&DA technical team, responsible for the underlying reporting environment, datasets, and data refresh processes. Additionally, you will work alongside the Product Line teams to interpret customer requirements, refine them into technical specifications, and develop reporting applications. Your key responsibilities will include delivering polished reporting solutions using tools such as Power BI, MS Report Builder, Excel, and Power Platform. You will manage ETL processes in SQL and Power Query, collaborate with various teams to solve problems and manage timelines, and ensure data security aligns with EY standards. Additionally, you will be involved in investigating and resolving reporting and data issues and managing the development cycle. To succeed in this role, you should possess advanced skills in Power BI report development, proficiency in DAX, and knowledge of ETL processes with Power Query. Effective communication with technical and Product Line teams, proficiency in data analysis using SQL, Power BI, and Excel, and the ability to adapt to changing business priorities are essential. Experience with Power Platform tools, project management, Azure DevOps, and Visual Studio would be beneficial but not mandatory. The ideal candidate will have at least 5 years of experience in a finance or technical department, with a high motivation to learn from experienced colleagues. Fluency in English, the ability to work independently, and a structured work approach are key attributes for success in this role. Joining the R&DA team at EY offers a dynamic and global delivery network, providing fulfilling career opportunities across various disciplines. You will have the chance to collaborate on exciting projects, work with well-known brands and technologies, and continuously develop your skills in a diverse and inclusive culture. EY is committed to building a better working world by creating new value for clients, people, society, and the planet. Through data, AI, and advanced technology, EY teams help clients shape the future with confidence and address pressing issues. With services spanning assurance, consulting, tax, strategy, and transactions, EY teams operate globally to provide value in more than 150 countries and territories.,

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Ghaziabad, Uttar Pradesh

On-site

You will be joining Recklabs, an Operating arm of technology, Finops, and tools ecosystem dedicated to helping organizations optimize their investments in digital, cloud computing, AI, and more. With a strong emphasis on innovation and fostering customer relationships, Recklabs caters to a wide range of leading enterprises, including 40 of the Fortune 500 and 160 of the Global 2000. As a BI Products Sales professional based in Ghaziabad, your primary responsibility will involve selling business intelligence products and services to clients. This full-time on-site role will require you to develop effective sales strategies, perform market research, and achieve sales targets consistently. To excel in this role, you should possess strong analytical skills and expertise in data analytics. Experience in Data Warehousing and Extract Transform Load (ETL) processes will be beneficial, along with proficiency in data modeling. Your communication and presentation skills should be top-notch, enabling you to effectively convey information to clients. Additionally, the ability to collaborate within a team environment is crucial for success in this position. While not mandatory, previous sales experience in the technology industry would be advantageous. A Bachelor's degree in Business, Marketing, or a related field is preferred to ensure a solid foundation for your responsibilities in this role.,

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

You are an experienced Qlik Sense & Qlik Cloud Developer who will be responsible for designing, developing, and implementing business intelligence solutions using Qlik Sense and Qlik Cloud. Your expertise in data visualization, dashboard development, and cloud-based analytics will be crucial in supporting data-driven decision-making. Your key responsibilities will include developing, maintaining, and enhancing Qlik Sense dashboards and Qlik Cloud applications to meet business analytics needs. You will design and implement data models, ETL processes, and data integration solutions from various sources. Optimizing Qlik applications for performance, scalability, and efficiency will also be a significant part of your role. Collaboration with business stakeholders to gather requirements and deliver insightful analytics solutions is essential. Ensuring data accuracy, integrity, and security across Qlik Sense and Qlik Cloud environments is a critical aspect of your job. Troubleshooting and resolving issues related to data connectivity, scripting, and performance tuning will also be part of your responsibilities. Staying updated with the latest Qlik technologies, best practices, and industry trends is required. Providing technical guidance and training to business users on Qlik Sense & Qlik Cloud functionalities is expected. Collaborating with IT and Data Engineering teams to ensure seamless integration with enterprise data systems is also part of your role. To qualify for this position, you should have 5 to 10 years of hands-on experience in Qlik Sense and Qlik Cloud development. Strong expertise in Qlik scripting, expressions, and set analysis is necessary. Experience with data modeling, ETL processes, and data transformation is required. Knowledge of SQL, relational databases, and data warehousing concepts is essential. Experience integrating Qlik Sense/Qlik Cloud with different data sources like SAP, REST APIs, Cloud Storage, etc., is preferred. A strong understanding of Qlik Management Console (QMC) and security configurations is important. Proficiency in performance optimization, data governance, and dashboard usability is expected. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud is a plus. You should be able to work independently and collaboratively in a fast-paced environment. Excellent communication and problem-solving skills are necessary for this role. This is a full-time position with the option to work from either Coimbatore or remotely. Interested candidates can send their resumes to fazilahamed.r@forartech.in or contact +91-7305020181. We are excited to meet you and explore the potential of having you as a valuable member of our team. Benefits include commuter assistance, flexible schedule, health insurance, leave encashment, provident fund, and the opportunity to work from home. The work schedule is during the day shift from Monday to Friday, and there is a performance bonus offered. If you are interested in applying for this position, please provide the following information: - Number of years of experience in Qlik Sense - Current CTC - Minimum expected CTC - Notice period or availability to join - Present location Work Location: Coimbatore / Remote (Work from Home),

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As a Database Administrator at RMgX, you will be responsible for designing, implementing, and optimizing database structures to support business applications and data-driven decision-making processes. You will manage and maintain multiple database management systems such as Oracle, SQL Server, and MySQL. Your role will involve developing and executing database queries, stored procedures, and functions while ensuring data security and integrity. Monitoring database performance, troubleshooting issues, and implementing solutions to optimize system efficiency will also be part of your responsibilities. Additionally, you will perform regular database backups, develop disaster recovery plans, and conduct data migrations as needed. Collaboration with development teams to integrate database systems with applications and staying current with emerging database technologies is crucial for success in this role. To qualify for this position, you should have a Bachelor's degree in Computer Science or a related field with 3-5 years of experience in database administration. Proficiency in multiple database management systems (Oracle, SQL Server, MySQL, PostgreSQL) with strong SQL programming skills is required. You should possess in-depth knowledge of database design, implementation, optimization, data modeling, and security best practices. Experience with ETL processes and data warehousing concepts will be advantageous. Excellent problem-solving, analytical, and communication skills are essential, along with an understanding of data privacy regulations and the ability to work efficiently in a fast-paced environment. In addition to a competitive salary, RMgX offers a range of perks and benefits to its employees, including Health Insurance and Personal Accident Insurance, Unlimited Telehealth Consultations, Unlimited Counselling Sessions, Unlimited Dental Consultations, Annual Eye Check-Up, Quarterly Learning Wallet, BYOD (Bring Your Own Device) Benefit, Laptop Buyback Scheme, Work-from-home Opportunity, and Flexible Timings. For more information about RMgX, please visit https://www.rmgx.in/.,

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

The Applications Development Programmer Analyst position is an intermediate level role where you will participate in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities. Your responsibilities will include utilizing your knowledge of applications development procedures and concepts to identify and define necessary system enhancements. You will be expected to identify and analyze issues, make recommendations, and implement solutions based on your understanding of business processes, system processes, and industry standards to solve complex issues. Your role will also involve analyzing information to recommend solutions and improvements, conducting testing and debugging, writing basic code for design specifications, and assessing risk when making business decisions. To excel in this role, you should have hands-on experience in designing, developing, and optimizing scalable distributed data processing pipelines using Apache Spark and Scala. You should possess 3 to 6+ years of experience in big data development, focusing on Apache Spark, Scala/Python, and distributed systems. Advanced knowledge of Scala, a good understanding of Python for data engineering tasks, solid understanding of data modeling principles and ETL processes in big data environments, strong analytical and problem-solving skills in analyzing and solving performance issues in Spark jobs and distributed systems, familiarity with Git, Jenkins, and other CI/CD tools for automating the deployment of big data applications, as well as experience with streaming platforms such as Apache Kafka or Spark Streaming. You should hold a Bachelor's degree/University degree or equivalent experience to be considered for this position. This job description provides an overview of the work performed, and other job-related duties may be assigned as required.,
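A minimal PySpark sketch of the batch ETL pattern referenced above (read raw files, clean, aggregate, write a partitioned output). File paths and column names are assumptions for illustration.

```python
# Hypothetical batch ETL job with PySpark: paths, columns, and the aggregation
# are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("/data/raw/orders/"))          # assumed input path

cleaned = (raw
           .dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("amount") > 0))

daily = (cleaned
         .groupBy("order_date", "country")
         .agg(F.sum("amount").alias("revenue"),
              F.countDistinct("customer_id").alias("customers")))

(daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("/data/curated/daily_revenue/"))  # assumed output path

spark.stop()
```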

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a highly skilled BI Lead with 8-10 years of experience, responsible for overseeing BI initiatives. Your role involves leading the migration of existing reports to Power BI, requiring expertise in Qlik and Power BI tools. You will design data models, integrate data from multiple sources, and provide technical guidance to a BI development team. Your key responsibilities include leading BI projects, managing migration processes, designing complex dashboards, and mentoring team members. You will collaborate with stakeholders to understand reporting needs, optimize BI infrastructure, and enhance existing processes for data governance and reporting. To excel in this role, you must have 8-10 years of BI development experience, proficiency in Qlik and Power BI, and experience in migrating BI solutions. Strong knowledge of data modeling, ETL processes, DAX, Power Query, and SQL is required. You should possess leadership skills, project management experience, and excellent communication abilities. Preferred skills include experience with cloud platforms, Python or R for advanced analytics, Agile methodologies, and knowledge of data visualization principles. Certification in Qlik, Power BI, or related BI tools is advantageous. This is a full-time, permanent position located in Pune. Immediate availability to serve notice is required.,

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You will be working as a full-time on-site Oracle OSM Developer and Architect based in Delhi/Bangalore/Pune, India. Your responsibilities will include designing and developing Oracle OSM solutions, data modeling, creating Oracle reports, and managing ETL processes. Collaboration with other software developers is essential to ensure smooth integration and functionality of software systems. To excel in this role, you must have hands-on experience with Core Oracle OSM (Order and Service Management) and UIM (Unified Inventory Management), proficiency in Oracle Reports and ETL processes, a solid software development background, exceptional problem-solving and analytical capabilities, and the ability to work both independently and as part of a team. A Bachelor's degree in Computer Science, Information Technology, or a related field is required for this position.,

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You are a skilled Boomi Integration Developer with experience in developing integrations involving UKG Pro Workforce Management (WFM). You have a strong technical background in Dell Boomi, hands-on experience with API integrations, and a functional understanding of UKG Pro WFM to support seamless data flow between systems. Your responsibilities will include designing, building, and managing integrations using Dell Boomi to connect UKG Pro WFM with other enterprise applications. You will work with REST/SOAP APIs to enable real-time and batch data processing. Identifying integration issues, optimizing performance, ensuring high availability of Boomi processes, and ensuring accurate data mapping between UKG Pro WFM and external systems will be part of your daily tasks. Collaboration with business teams, HRIS teams, and IT teams to gather integration requirements and deliver solutions will be essential. Additionally, you will need to ensure that integrations comply with data security and governance policies. To qualify for this role, you should have at least 3 years of experience in technical integrations, focusing on Dell Boomi. Hands-on experience in developing and deploying Boomi processes (Cloud and On-Premise) is required. Strong experience with UKG Pro WFM and understanding of its functional modules (e.g., Timekeeping, Accruals, Scheduling) are also necessary. Experience working with APIs (REST/SOAP), JSON, XML, and Web Services is a must. Knowledge of SQL, data transformations, and ETL processes for integrating various data sources is essential. You should be able to analyze integration requirements and deliver optimized solutions, as well as be familiar with error handling, exception management, and logging in Boomi. Strong problem-solving and troubleshooting skills are highly valued in this role. It would be beneficial if you have familiarity with other iPaaS tools or integration platforms and exposure to cloud platforms like AWS/Azure for hybrid integrations.,

Posted 1 month ago

Apply

15.0 - 19.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Data Scientist and Architect at Citi, you will play a crucial role in designing and architecting the high-performance analytical dashboards and performance metrics used by senior stakeholders across the Banking and International (B&I) franchise. You will collaborate with teams from B&I Data and subject matter experts from Risk, Finance, and Technology to design, validate, develop, and deploy both minimum viable products (MVPs) and ready-to-deploy solutions.

In this role, you will define, ideate, develop, and distribute analytical dashboard solutions with far-reaching impact for Citi's international business. You will lead versatile teams of junior data scientists, analysts, and engineers, ensuring the design, build, and enhancement of high-performance, data-intensive solutions from data acquisition to visualization and distribution. Additionally, you will build on and enhance existing data quality processes to maintain the accuracy, timeliness, and completeness of various dashboards and reports. Your role will involve coordinating work within the team or project group, liaising with multiple data teams and departments, and serving as a subject matter expert for the planning and analysis of project deliverables and processes.

To excel in this position, you should have 15 years of experience in the banking or finance industry, with a strong understanding of business analysis, project management, and solution design. Effective communication skills are essential, as you will develop and deliver multi-mode communications that cater to different audiences, driving consensus and influencing relationships at all levels. Proficiency in SQL, Python, Tableau, Java, web development frameworks, and data science concepts is mandatory.

As an adaptable, creative, and resourceful team player with a strong work ethic, you will be well equipped to tackle technical challenges, troubleshoot issues, and drive innovation in a supported, resource-rich environment. A Bachelor's or Master's degree is required, and additional job-related duties may be assigned as needed.

Citi is committed to being an equal opportunity and affirmative action employer, providing qualified individuals with the opportunity to apply for career opportunities. If you require a reasonable accommodation due to a disability during the application process, please review the Accessibility at Citi guidelines.
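
As a hedged illustration of the data quality work mentioned above, the following Python sketch runs simple completeness and freshness checks on a reporting extract with pandas. Column names and thresholds are hypothetical placeholders, not Citi specifics.

```python
# Illustrative sketch only: simple completeness and timeliness checks of the
# kind a dashboard data-quality process might run before publication.
# Column names and thresholds are hypothetical placeholders.
from datetime import datetime, timedelta

import pandas as pd

REQUIRED_COLUMNS = ["account_id", "region", "balance", "as_of_date"]

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return completeness ratios per required column plus a freshness flag."""
    completeness = {
        col: float(df[col].notna().mean()) if col in df.columns else 0.0
        for col in REQUIRED_COLUMNS
    }
    latest = pd.to_datetime(df["as_of_date"]).max()
    is_fresh = (datetime.now() - latest.to_pydatetime()) <= timedelta(days=1)
    return {"completeness": completeness, "fresh_within_24h": bool(is_fresh)}

if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "account_id": [1, 2, 3],
            "region": ["EMEA", "APAC", None],
            "balance": [100.0, 250.5, 80.0],
            "as_of_date": ["2024-01-02", "2024-01-02", "2024-01-02"],
        }
    )
    print(run_quality_checks(sample))
```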

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You will be joining our team in Noida as a highly analytical and detail-oriented data analytics expert. Your main responsibilities will include designing, writing, and optimizing complex SQL queries, functions, and procedures in PostgreSQL, as well as analyzing large datasets to extract insights for business decision-making. You will also develop, maintain, and publish dynamic, interactive Power BI dashboards and reports.

Your role will involve collaborating with both business and technical teams to understand data requirements and deliver analytics solutions. You will be responsible for ensuring data accuracy and consistency, optimizing the performance of analytical queries, and creating documentation for data models, processes, and reports.

To excel in this role, you should have strong hands-on experience with PostgreSQL, including advanced querying, indexing, procedures, and performance tuning. Proficiency in writing complex SQL for large, relational datasets is crucial, as is expertise in Power BI, particularly data modeling, DAX, and visualization best practices. The ability to translate business needs into data insights, along with a good understanding of ETL processes and data pipelines, will be beneficial. Experience working in Agile/Scrum teams is preferred but not mandatory.

Qualified candidates will hold a Bachelor's degree in Computer Science, Information Technology, or a related field, and bring strong problem-solving and communication skills along with experience integrating data from multiple sources.
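
To illustrate the performance-tuning aspect of the role, here is a minimal Python sketch that prints a PostgreSQL execution plan via psycopg2 so costly sequential scans are easy to spot. The connection string, table, and columns are hypothetical placeholders.

```python
# Illustrative only: inspecting the plan of an analytical query, the kind of
# performance-tuning step described above. The DSN, table and columns are
# hypothetical placeholders.
import psycopg2

DSN = "dbname=analytics user=analyst host=localhost"   # hypothetical connection

QUERY = """
    SELECT region, date_trunc('month', order_date) AS month, SUM(amount) AS revenue
    FROM sales_orders                      -- hypothetical fact table
    WHERE order_date >= %s
    GROUP BY region, month
    ORDER BY region, month
"""

def explain_query(since: str) -> None:
    """Print the PostgreSQL plan for the aggregation query."""
    with psycopg2.connect(DSN) as conn:
        with conn.cursor() as cur:
            cur.execute("EXPLAIN ANALYZE " + QUERY, (since,))
            for (line,) in cur.fetchall():
                print(line)

if __name__ == "__main__":
    explain_query("2024-01-01")
    # If the plan shows a sequential scan on sales_orders, an index on
    # order_date may be worth testing.
```

If the plan reveals a full scan on the date filter, adding or adjusting an index and re-running the same check is a typical next step in this kind of tuning loop.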

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

The AIML Architect - Dataflow, BigQuery position is a critical role within our organization, focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. You will combine advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that enhance decision-making processes across various departments. Your responsibilities will include building data pipeline solutions that use BigQuery and Dataflow to ensure high performance, scalability, and resilience in our data workflows. Collaboration with data engineers, data scientists, and application developers is essential to align with business goals and the technical vision.

You must possess a deep understanding of cloud-native architectures and be enthusiastic about leveraging cutting-edge technologies to drive innovation, efficiency, and insights from extensive datasets. You should have a robust background in data processing and AI/ML methodologies, and be capable of translating complex technical requirements into scalable solutions that meet the evolving needs of the organization.

Key Responsibilities
- Design and architect data processing solutions using Google Cloud BigQuery and Dataflow.
- Develop data pipeline frameworks supporting batch and real-time analytics.
- Implement machine learning algorithms for extracting insights from large datasets.
- Optimize data storage and retrieval processes to improve performance.
- Collaborate with data scientists to build scalable models.
- Ensure data quality and integrity throughout the data lifecycle.
- Work closely with cross-functional teams to align data workflows with business objectives.
- Conduct technical evaluations and assessments of new tools and technologies.
- Manage large-scale data migrations to cloud environments.
- Document architecture designs and maintain technical specifications.
- Provide mentorship and guidance to junior data engineers and analysts.
- Stay updated on industry trends in cloud computing and data engineering.
- Design and implement security best practices for data access and storage.
- Monitor and troubleshoot data pipeline performance issues.
- Conduct training sessions on BigQuery best practices for team members.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data architecture and engineering.
- Proficiency in Google Cloud Platform, especially BigQuery and Dataflow.
- Strong understanding of data modeling and ETL processes.
- Experience implementing machine learning solutions in cloud environments.
- Solid programming skills in Python, Java, or Scala.
- Expertise in SQL and other query optimization techniques.
- Experience with big data workloads and distributed computing.
- Familiarity with modern data processing frameworks and tools.
- Strong analytical and problem-solving skills.
- Excellent communication and team collaboration abilities.
- Proven track record of managing comprehensive projects from inception to completion.
- Ability to work in a fast-paced, agile environment.
- Understanding of data governance, compliance, and security.
- Experience with data visualization tools is a plus.
- Certifications in Google Cloud or relevant technologies are advantageous.

Skills
- Cloud Computing
- SQL Proficiency
- Dataflow
- AIML
- Scala
- Data Governance
- ETL Processes
- Python
- Machine Learning
- Java
- Google Cloud Platform
- Data Architecture
- Data Modeling
- BigQuery
- Data Engineering
- Data Visualization Tools
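
For context, here is a minimal sketch of a batch pipeline of the kind described above, written with the Apache Beam Python SDK (the programming model behind Dataflow) and landing rows in BigQuery. The GCS paths, project, dataset, table, and schema are hypothetical placeholders.

```python
# Minimal sketch only: a batch Apache Beam pipeline of the kind that runs on
# Dataflow and lands data in BigQuery. GCS path, project/dataset/table and
# schema are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(line: str) -> dict:
    """Turn one JSON line into a BigQuery-ready row (hypothetical fields)."""
    event = json.loads(line)
    return {
        "user_id": event.get("user_id"),
        "event_type": event.get("type", "unknown"),
        "value": float(event.get("value", 0)),
    }

def run() -> None:
    options = PipelineOptions(
        runner="DataflowRunner",            # or "DirectRunner" for local testing
        project="my-gcp-project",           # hypothetical project
        region="us-central1",
        temp_location="gs://my-bucket/tmp", # hypothetical bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromText("gs://my-bucket/events/*.json")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.events",
                schema="user_id:STRING,event_type:STRING,value:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()
```

Running the same code with the DirectRunner locally before switching to the DataflowRunner is a common way to validate pipeline logic cheaply before deploying at scale.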

Posted 1 month ago

Apply
