
4654 Extract Jobs - Page 38

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Join us as an Assistant Vice President at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. To be successful as an Assistant Vice President, you should have experience with:

Basic/Essential Qualifications
- Expert-level hands-on experience with data management and its associated tools for maintaining good-quality data.
- Expert-level hands-on experience with methods to analyse poor-quality data and understand data lineage.
- Sound knowledge of derivative trades, collateralization, and netting logic applied in BFSI.
- Sound knowledge of reference data and master data.
- Sound knowledge of the accounting of transactions and their representation in a bank's financial statements (P&L and balance sheet).
- Strong communication and presentation skills with senior leaders.

Desirable Skillsets/Good to Have
- Hands-on experience with dashboard development using tools like Tableau.
- Experience defining and establishing KRIs and KPIs to measure data quality.
- Knowledge/experience in posting and reviewing accounting entries for complex derivative trade structures.

You may be assessed on the key critical skills relevant for success in the role, such as handling data, identifying data quality issues, and performing detailed root cause analysis on poor-quality data to identify and recommend solutions, as well as job-specific skillsets.

Location: Chennai

Purpose of the role
To implement data quality processes and procedures, ensuring that data is reliable and trustworthy, then extract actionable insights from it to help the organisation improve its operations and optimise resources.

Accountabilities
- Investigation and analysis of data issues related to quality, lineage, controls, and authoritative source identification.
- Execution of data cleansing and transformation tasks to prepare data for analysis.
- Designing and building data pipelines to automate data movement and processing.
- Development and application of advanced analytical techniques, including machine learning and AI, to solve complex business problems.
- Documentation of data quality findings and recommendations for improvement.

Assistant Vice President Expectations
- Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues.
- Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda.
- Take ownership of managing risk and strengthening controls in relation to the work done.
- Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function.
- Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and business strategy.
- Engage in complex analysis of data from multiple internal and external sources, such as procedures and practices (in other areas, teams, companies, etc.), to solve problems creatively and effectively.
- Communicate complex information. 'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience.
- Influence or convince stakeholders to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
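The data-quality accountabilities described above (profiling data, spotting poor-quality records, measuring quality) can be sketched with a toy example. The record fields, rules, and KRI-style metrics here are illustrative assumptions, not Barclays tooling:

```python
# Toy data-quality check over a list of trade records.
# Field names (trade_id, counterparty, notional) are illustrative assumptions.

trades = [
    {"trade_id": "T1", "counterparty": "CP-A", "notional": 1_000_000},
    {"trade_id": "T2", "counterparty": None,   "notional": 250_000},   # missing field
    {"trade_id": "T2", "counterparty": "CP-B", "notional": 250_000},   # duplicate id
    {"trade_id": "T3", "counterparty": "CP-C", "notional": -5_000},    # suspect value
]

def data_quality_report(records):
    """Return simple KRI-style metrics: completeness, uniqueness, validity."""
    total = len(records)
    missing_cp = sum(1 for r in records if not r["counterparty"])
    ids = [r["trade_id"] for r in records]
    duplicate_ids = total - len(set(ids))
    invalid_notional = sum(1 for r in records if r["notional"] <= 0)
    clean = sum(1 for r in records if r["counterparty"] and r["notional"] > 0)
    return {
        "records": total,
        "missing_counterparty": missing_cp,
        "duplicate_trade_ids": duplicate_ids,
        "non_positive_notional": invalid_notional,
        "pct_clean": round(100 * clean / total, 1),
    }

report = data_quality_report(trades)
print(report)
```

Each failing record would then feed the root cause analysis step: trace its lineage back to the source system that produced the gap.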

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Before applying to a job, select your preferred language from the options available at the top right of this page. Discover your next opportunity within an organization that ranks among the 500 largest companies in the world. Envision innovative possibilities, discover our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles suited to your aspirations and your skills, today and tomorrow.

Job Description
Grade: 20E
Shift timing: Rotation between 10:30 to 19:30 IST and 11:30 to 20:30 IST

Summary
This role supports the Global Pricing function by gathering, analyzing, and interpreting data to shape UPS's pricing strategy. The position involves consulting with internal experts to identify the best analytical approaches, leveraging in-house reporting tools and external data sources to generate insights. The analyst summarizes findings and communicates key takeaways to stakeholders across the organization. The role requires expertise in running queries from multiple data sources, cleansing and validating data, and identifying performance trends and drivers. Additionally, the analyst presents insights in a clear, actionable format to inform management decisions. This position includes managerial responsibilities.

Responsibilities
- Develops subject matter expertise on internal and external data sources for strategic decision-making.
- Compiles, updates, and distributes reports to stakeholders with timely insights.
- Conducts data analysis to identify performance trends and key business drivers.
- Manages ad-hoc data requests and builds structured databases for new reporting needs.
- Documents and optimizes queries to ensure data integrity and efficiency.
- Engages with data providers to maintain quality and streamline data processing.
- Presents findings cross-functionally, highlighting opportunities to enhance business strategies.

Qualifications
- Technical Expertise: Proficiency in Microsoft Office Suite, SQL, and Power BI.
- UPS Platform Experience (Preferred): Familiarity with DWH, OBI, GCPR, CDP, and Databricks.
- Educational Background (Preferred): Master's degree (or equivalent) in Marketing, Analytics, Data Science, or a related field.
- Analytical Skills: Strong ability to conduct analysis, diagnose trends, and develop data-driven solutions.
- Problem-Solving: Identifies root causes of business challenges and proposes effective solutions.
- Data Management: Uses analytical tools to manage large datasets and extract meaningful insights.

Contract Type: Permanent (CDI)

At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Before you apply to a job, select your language preference from the options available at the top right of this page. Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description
Grade: 20E
Shift timing: Rotation between 10:30 to 19:30 IST and 11:30 to 20:30 IST

Summary
This role supports the Global Pricing function by gathering, analyzing, and interpreting data to shape UPS's pricing strategy. The position involves consulting with internal experts to identify the best analytical approaches, leveraging in-house reporting tools and external data sources to generate insights. The analyst summarizes findings and communicates key takeaways to stakeholders across the organization. The role requires expertise in running queries from multiple data sources, cleansing and validating data, and identifying performance trends and drivers. Additionally, the analyst presents insights in a clear, actionable format to inform management decisions. This position includes managerial responsibilities.

Responsibilities
- Develops subject matter expertise on internal and external data sources for strategic decision-making.
- Compiles, updates, and distributes reports to stakeholders with timely insights.
- Conducts data analysis to identify performance trends and key business drivers.
- Manages ad-hoc data requests and builds structured databases for new reporting needs.
- Documents and optimizes queries to ensure data integrity and efficiency.
- Engages with data providers to maintain quality and streamline data processing.
- Presents findings cross-functionally, highlighting opportunities to enhance business strategies.

Qualifications
- Technical Expertise: Proficiency in Microsoft Office Suite, SQL, and Power BI.
- UPS Platform Experience (Preferred): Familiarity with DWH, OBI, GCPR, CDP, and Databricks.
- Educational Background (Preferred): Master's degree (or equivalent) in Marketing, Analytics, Data Science, or a related field.
- Analytical Skills: Strong ability to conduct analysis, diagnose trends, and develop data-driven solutions.
- Problem-Solving: Identifies root causes of business challenges and proposes effective solutions.
- Data Management: Uses analytical tools to manage large datasets and extract meaningful insights.

Employee Type: Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
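The cleanse-validate-summarize loop this posting describes (query results in, deduplicated and validated figures out, trend summary for stakeholders) can be sketched in a few lines. The source fields, week labels, and lane names below are assumptions for the example, not UPS systems:

```python
# Illustrative sketch: cleanse rows pulled from a reporting query, then
# summarize a week-over-week trend. All data here is made up.

raw_rows = [
    {"week": "2024-W01", "lane": "PNQ-DEL", "revenue": "1200.50"},
    {"week": "2024-W01", "lane": "PNQ-DEL", "revenue": "1200.50"},  # duplicate row
    {"week": "2024-W02", "lane": "PNQ-DEL", "revenue": " 1344.00 "},
    {"week": "2024-W02", "lane": "PNQ-DEL", "revenue": "n/a"},      # bad value
]

def cleanse(rows):
    """De-duplicate and coerce revenue to float, dropping unparseable rows."""
    seen, clean = set(), []
    for row in rows:
        key = (row["week"], row["lane"], row["revenue"].strip())
        if key in seen:
            continue
        seen.add(key)
        try:
            clean.append({**row, "revenue": float(row["revenue"])})
        except ValueError:
            continue  # in practice, log it and flag the data provider
    return clean

def weekly_totals(rows):
    totals = {}
    for row in rows:
        totals[row["week"]] = totals.get(row["week"], 0.0) + row["revenue"]
    return totals

clean_rows = cleanse(raw_rows)
totals = weekly_totals(clean_rows)
growth_pct = round(100 * (totals["2024-W02"] / totals["2024-W01"] - 1), 1)
print(totals, growth_pct)
```

The same shape applies whether the rows come from SQL, DWH extracts, or spreadsheets: validate first, then compute the trend.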

Posted 1 week ago

Apply

1.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Description
The Retail Business Services (RBS) group is an integral part of Amazon's online product life-cycle and supports buying operations. The team's primary role is to support the creation and enhancement of retail selection on the worldwide Amazon online catalog. The tasks handled by this group can impact online user experience. The successful Subject Matter Expert is a problem-solver, mentor and communicator with expertise in process optimization and systems thinking. You will engage directly with multiple internal teams to drive business projects for the RBS team. You will utilize a wide range of skills and work on operational quality to independently drive performance improvement projects. In this role you will be focused on improving the experience and satisfaction of Amazon customers (vendors/vendor managers/end customers), and on root cause analysis of issues and opportunities affecting the business.

Key job responsibilities
- Develop strategies for continuous improvement in process and customer quality.
- Strengthen the existing process by ensuring identification of automation and upstream defect elimination opportunities.
- Drive process excellence initiatives, drive Kaizen events, and work on new automation/solution-building projects.
- Drill into large amounts of data and extract meaningful business metrics.
- Perform data analysis on observed trends and recommend solutions to the product and business teams.
- Collaborate with partner teams and stakeholders across the globe to deliver on key business goals and objectives by driving consensus and building trust.
- Dive deep into a problem, perform root cause analysis, and implement corrective actions to avoid defect recurrence.
- Establish key reports for the functional area.
- Write well-structured and detail-oriented documents in a clear, concise and audience-specific format.

The Candidate Is/Has
- Aptitude and interest for upstream defect elimination.
- Ability to identify, prioritize and coordinate work streams as necessary, including prioritizing, scheduling, time management, and meeting deadlines.
- High attention to detail and proven ability to manage multiple, competing priorities simultaneously with minimal supervision.

About The Team
The RBS team is an integral part of Amazon's online product lifecycle and buying operations. The team is designed to ensure Amazon remains competitive in the online retail space with the best price, wide selection and good product information. The team's primary role is to create and enhance retail selection on the worldwide Amazon online catalog. The tasks handled by this group have a direct impact on customer buying decisions and online user experience.

Basic Qualifications
- 1+ years of program or project management experience
- Experience using data to influence business decisions
- 1+ years of experience interacting with customers/stakeholders
- Bachelor's degree
- Knowledge of MS Office
- Experience working on root cause analysis, and corrective and preventive actions for solving customer problems and preventing defects

Preferred Qualifications
- Knowledge of analytics and statistical tools such as SAS, Power BI, SQL, and ETL/DW concepts
- Knowledge of visualization tools such as Tableau, Datazen, SSRS
- Experience in back-office operations, escalation management and troubleshooting environments
- Experience working in e-commerce/retail/supply chain/financial services business
- Experience in a global client-facing role
- Six Sigma Green Belt certified
- ISO 9001 lead auditor certified

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - BLR 14 SEZ
Job ID: A3008389
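A common first step in the root cause analysis and upstream defect elimination work this posting describes is a Pareto-style breakdown: rank defect categories and isolate the "vital few" driving most of the volume. The defect categories and counts below are invented for illustration:

```python
# Toy Pareto analysis of catalog defects: find the smallest set of
# categories covering at least 80% of all defects.

from collections import Counter

defects = (
    ["missing_image"] * 45 + ["wrong_price"] * 30 +
    ["bad_title"] * 15 + ["duplicate_listing"] * 10
)

counts = Counter(defects)
total = sum(counts.values())

ranked = counts.most_common()       # categories sorted by frequency, descending
cumulative, vital_few = 0, []
for category, n in ranked:
    cumulative += n
    vital_few.append(category)
    if cumulative / total >= 0.8:   # stop once 80% of defects are covered
        break

print(vital_few)
```

Corrective actions would then target those few categories first, since eliminating them upstream removes most of the defect volume.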

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Step into the role of Assistant Manager - Fraud Analytics, where you'll take responsibility for client service and operational execution tasks. You must take responsibility for controlling risk and enhancing controls in connection with your job and areas of responsibility, in accordance with rules and regulations. You must follow well-defined procedures that may require a range of job routines, and exercise judgement based on practice and previous experience. To thrive in this role, you'll need some previous experience in:

- Bachelor's degree or equivalent in a quantitative field of study (a master's degree is good to have).
- Data and analytical experience with problem-solving skills.
- Ability to perform and handle multiple workstreams in a deadline-driven environment.
- Working knowledge of SAS, SQL, and Microsoft Excel.
- Relevant industry experience.
- Effective communication skills – fluent in written and spoken English.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Noida.

Purpose of the role
To use innovative data analytics and machine learning techniques to extract valuable insights from the bank's data reserves, leveraging these insights to inform strategic decision-making, improve operational efficiency, and drive innovation across the organisation.

Accountabilities
- Identification, collection, and extraction of data from various sources, including internal and external sources.
- Performing data cleaning, wrangling, and transformation to ensure its quality and suitability for analysis.
- Development and maintenance of efficient data pipelines for automated data acquisition and processing.
- Design and conduct of statistical and machine learning models to analyse patterns, trends, and relationships in the data.
- Development and implementation of predictive models to forecast future outcomes and identify potential risks and opportunities.
- Collaboration with business stakeholders to seek out opportunities to add value from data through data science.

Analyst Expectations
- Perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
- Requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area.
- They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR, for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate.
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership of managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function.
- Make evaluative judgements based on the analysis of factual information, paying attention to detail.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex/sensitive information.
- Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
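The "statistical models to analyse patterns" accountability above can be illustrated with the simplest possible screen: flagging transactions far from the mean. The amounts and the 2-sigma threshold are assumptions for the sketch; real fraud models are far richer than this:

```python
# Minimal statistical anomaly screen: flag transaction amounts more than
# 2 sample standard deviations from the mean. Data is illustrative only.

import statistics

amounts = [120, 95, 110, 130, 105, 98, 115, 5000]  # one obvious outlier

mean = statistics.mean(amounts)
stdev = statistics.stdev(amounts)  # sample standard deviation

def z_score(x):
    return (x - mean) / stdev

flagged = [x for x in amounts if abs(z_score(x)) > 2]
print(flagged)
```

In practice, flagged items would feed a case queue for investigation, and the screen itself would be backtested against confirmed fraud outcomes.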

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SAP BusinessObjects Data Services
Good to have skills: NA
Minimum 5 Year(s) Of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP BusinessObjects Data Services.
- Strong understanding of data integration and transformation processes.
- Experience with ETL (Extract, Transform, Load) methodologies.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and resolve application issues effectively.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
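SAP BusinessObjects Data Services jobs are built graphically rather than coded by hand, so the following is not BODS code; it is a generic sketch of the extract-transform-load pattern the "ETL methodologies" requirement refers to. Table and column names are made up, and in-memory SQLite stands in for real source and target databases:

```python
# Generic ETL sketch: extract rows from a source, conform them, load a target.

import sqlite3

# Extract: read rows from a source table.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers_raw (id INTEGER, name TEXT, country TEXT)")
src.executemany(
    "INSERT INTO customers_raw VALUES (?, ?, ?)",
    [(1, "  Asha  ", "in"), (2, "Ravi", "IN"), (3, None, "us")],
)
rows = src.execute("SELECT id, name, country FROM customers_raw").fetchall()

# Transform: trim names, standardise country codes, drop incomplete records.
transformed = [
    (cid, name.strip(), country.upper())
    for cid, name, country in rows
    if name is not None
]

# Load: write the conformed rows to the target table.
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE customers (id INTEGER, name TEXT, country TEXT)")
tgt.executemany("INSERT INTO customers VALUES (?, ?, ?)", transformed)
loaded = tgt.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
print(loaded)
```

In a BODS job the same three stages appear as source datastores, Query transforms, and target datastores; the conceptual flow is identical.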

Posted 1 week ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site


Experience: 4+ years.

Job Description
Experience playing the role of a senior BA who can articulate (written and verbal), negotiate, and finalise user stories with client business stakeholders. Good detailing and analytics skills with a natural interest in problem-solving. Hands-on expertise in producing deliverables like requirement specifications/user stories and the product backlog. Experience working with Agile teams, taking an independent role in requirement grooming and sprint planning, and working with the software product development team. Experience working with UI/UX/visual design to guide the product's user interface process. Adequate technical exposure to understand modern web/mobile application development, along with the ability to visualize and understand data structures.

Job Duties And Responsibilities
- Analysis skills: a natural interest in analysis and problem-solving. Should be able to analyze business problems faced by the client and suggest solutions. Ability to go into details and extract functional details: understand business objectives by eliciting and studying user needs.
- Capability to articulate product requirements (written and verbal) and present them. This may involve preparing mindmaps and other diagrams to convey ideas to the client, designing wireframes, and writing user stories and acceptance criteria.
- Excellent communication skills: articulate (written and verbal), negotiate, and finalize product business requirements with client stakeholders.
- Documentation skills: should be excellent at documenting user stories in a detailed manner. Knowledge of preparing business process flowcharts using tools. Knowledge of preparing Functional and System Requirement Specification documentation in the form of user stories and acceptance criteria. Familiarity with tools like JIRA/Azure DevOps/AHA etc. Knowledge of creating wireframes using prototyping or wireframing tools.
- Ability to visualize the user interface of features, flag usability challenges, and provide suggestions to the UI/UX team.
- Work with software development, quality control/testing, and UI design teams during product development sprints.
- Testing skills: capability to test products during the development stage, manage User Acceptance Testing, and take a system live.
- Experience in developing user documentation, providing business support, and training users.
- Good presentation skills.
- Willingness to travel to client sites for short to medium durations (1 to 10 weeks).

Any Additional Information/Specifics
Participate in the full product development cycle, including brainstorming, release planning and estimation, implementing and iterating on code, coordinating with internal and external stakeholders, MVP and production releases, quality assurance, and product support. Highly effective and thrives in a dynamic environment. Comfortable with proactive outward communication and leadership, and positive about accepting challenges. Adhere to ISMS policies and procedures.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description

Your Responsibilities
Do you want to join a team passionate about engineering data solutions to inform business decisions and transform business performance? ADM is seeking a Data Engineer who thrives in an environment continuously integrating and deploying data products and analytics that drive business value. The Data Engineer is a team player who sees problems as challenges to be creatively solved in collaboration with other data engineers, data scientists, and business colleagues. The ideal candidate is a driven learner who develops and sharpens skills building data pipelines, transforming raw data into useful data systems, and optimizing the data delivery architecture.

- Learn to design, build, refactor, and maintain data pipelines using Microsoft Azure, Databricks, SAP Datasphere, SQL, Azure Data Factory, Python, and PySpark to meet business requirements for reporting, analysis, and data science
- Participate in designing and integrating fault tolerance and enhancements into data pipelines to improve quality and performance
- Monitor data pipelines using analytic tools to develop actionable insights into performance issues
- Perform root cause analysis and solve problems using analytical and technical skills to optimize data delivery and reduce costs
- Adhere to code standards and DataOps and MLOps best practices to accelerate and continuously improve data system performance

Your Profile
- 2+ years of proven data engineering experience
- Bachelor's degree in computer science, software engineering, information technology, or an equivalent combination of data engineering professional experience and education
- Knowledge of Microsoft Azure, SQL, Databricks, SAP Datasphere, Azure Data Factory, Python, PySpark, Power BI or other cloud-based data systems
- Knowledge of Azure DevOps, GitHub, and CI/CD is a plus
- Working knowledge of relational database systems
- Task management and organizational skills
- Knowledge of or demonstrated experience building cloud ETL pipelines using code or ETL platforms, utilizing database connections, APIs, or file-based sources
- Knowledge of data manipulation and processing techniques to extract value from large, disconnected datasets
- Continuous learning to upskill data engineering techniques and business acumen

#IncludingYou
Diversity, equity, inclusion and belonging are cornerstones of ADM's efforts to continue innovating, driving growth, and delivering outstanding performance. We are committed to attracting and retaining a diverse workforce and creating welcoming, truly inclusive work environments — environments that enable every ADM colleague to feel comfortable on the job, make meaningful contributions to our success, and grow their career. We respect and value the unique backgrounds and experiences that each person brings to ADM because we know that diversity of perspectives makes us better, together. For more information regarding our efforts to advance Diversity, Equity, Inclusion & Belonging, please visit our website here: Diversity, Equity and Inclusion | ADM.

About ADM
At ADM, we unlock the power of nature to provide access to nutrition worldwide. With industry-advancing innovations, a complete portfolio of ingredients and solutions to meet any taste, and a commitment to sustainability, we give customers an edge in solving the nutritional challenges of today and tomorrow. We're a global leader in human and animal nutrition and the world's premier agricultural origination and processing company. Our breadth, depth, insights, facilities and logistical expertise give us unparalleled capabilities to meet needs for food, beverages, health and wellness, and more. From the seed of the idea to the outcome of the solution, we enrich the quality of life the world over. Learn more at www.adm.com.

Req/Job ID: 97994BR
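The pipeline work described above (raw data in, cleaned and aggregated data out) can be sketched without the Azure/Databricks stack. The following is plain Python standing in for a PySpark job; the sensor fields, plant names, and quality gate are assumptions for the example:

```python
# Toy pipeline: ingest raw readings, apply a quality gate, aggregate by plant.

raw_readings = [
    {"plant": "decatur",   "moisture_pct": 14.2},
    {"plant": "decatur",   "moisture_pct": None},   # failed sensor read
    {"plant": "decatur",   "moisture_pct": 13.8},
    {"plant": "rotterdam", "moisture_pct": 12.9},
]

def clean(records):
    """Drop records with missing measurements (a simple quality gate)."""
    return [r for r in records if r["moisture_pct"] is not None]

def avg_by_plant(records):
    """Aggregate: average moisture per plant, rounded for reporting."""
    sums, counts = {}, {}
    for r in records:
        sums[r["plant"]] = sums.get(r["plant"], 0.0) + r["moisture_pct"]
        counts[r["plant"]] = counts.get(r["plant"], 0) + 1
    return {p: round(sums[p] / counts[p], 2) for p in sums}

averages = avg_by_plant(clean(raw_readings))
print(averages)
```

In PySpark the same stages would become a DataFrame read, a `filter`/`dropna`, and a `groupBy().avg()`, scheduled through Azure Data Factory.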

Posted 1 week ago

Apply

10.0 years

0 Lacs

Thrissur, Kerala, India

On-site


About Rylaq
Rylaq is pioneering high-value functional mushroom products—ranging from DIY grow-kits to nutraceutical formulations. As we scale our cultivation and extraction operations, we're seeking an industry veteran to guide our R&D, production protocols, and quality systems.

Role Overview
As our Director – Mushroom Cultivation & Extraction, you will:
- Advise on the design and optimization of controlled-environment growth chambers
- Develop and validate scalable extraction processes for bioactive compounds
- Establish SOPs and QC/QA benchmarks (e.g., β-glucan assays, microbial limits)
- Mentor our technical team and liaise with contract growers/labs

You'll be the technical authority driving Rylaq's upstream (spawn → fruiting) and midstream (harvest → extract) excellence.

Key Responsibilities
- Cultivation Strategy: Recommend substrate formulations (sawdust, coir, bran) and spawn ratios. Optimize parameters (temperature, humidity, CO₂, light) for Cordyceps, Lion's Mane, Reishi, etc.
- Extraction & Formulation: Design scalable solvent- or water-based extraction workflows. Specify concentration targets, solvent recovery, and residue management.
- Quality & Compliance: Draft and implement GMP-level SOPs. Define QC tests: potency (HPLC/UV), microbial, heavy metals. Ensure FSSAI/ISO/HACCP readiness.
- Team Leadership: Train lab technicians on sterile technique, aseptic transfers, and batch records. Coordinate with external contract manufacturers for pre-order kit fulfillment.
- Technology Evaluation: Scout and pilot new bioreactor, pasteurization, and drying technologies. Drive continuous improvement via small-scale trials and data analysis.

Ideal Candidate Profile
- Experience: 10+ years in mushroom cultivation and/or extraction for nutraceutical or pharmaceutical applications. Proven track record scaling from lab to commercial (5 kg → 500 kg+/month).
- Technical Skills: Deep understanding of spawn preparation, sterile culture, and substrate pasteurization/sterilization. Expertise in extraction techniques (Soxhlet, maceration, supercritical CO₂). Familiarity with downstream processing: concentration, spray drying, encapsulation.
- Regulatory Know-How: Hands-on with FSSAI/DGFT regulations for food/health products. GMP and HACCP documentation and audits.
- Soft Skills: Strong coaching and cross-functional communication. Data-driven mindset with proficiency in basic statistical tools. Entrepreneurial spirit and comfort in a fast-moving startup.

What We Offer
- Strategic advisory equity or profit-share package
- Opportunity to shape a novel functional mushroom brand from the ground up
- Collaboration with a passionate, multidisciplinary team
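The scale-up and concentration targets above imply routine mass-balance arithmetic: fresh harvest to dried biomass to finished extract. A minimal sketch, with all figures invented for illustration rather than taken from any Rylaq process:

```python
# Toy mass-balance arithmetic for an extraction operation.

def extraction_yield_pct(dry_input_kg, extract_kg):
    """Percent of dry biomass recovered as finished extract."""
    return round(100 * extract_kg / dry_input_kg, 1)

def monthly_extract_kg(fresh_harvest_kg, dry_matter_fraction, yield_pct):
    """Fresh harvest -> dried biomass -> extract output, per month."""
    dry_kg = fresh_harvest_kg * dry_matter_fraction
    return round(dry_kg * yield_pct / 100, 1)

# Hypothetical numbers: a 10 kg dry batch giving 3 kg of extract, applied to
# 500 kg/month of fresh harvest at an assumed 10% dry matter.
yield_pct = extraction_yield_pct(dry_input_kg=10.0, extract_kg=3.0)
output = monthly_extract_kg(500, 0.10, yield_pct)
print(yield_pct, output)
```

Tracking yields like this batch by batch is what turns the "data-driven mindset" requirement into actionable process-improvement trials.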

Posted 1 week ago

Apply

7.5 years

0 Lacs

Pune, Maharashtra, India

On-site


Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: IBM Netezza
Good-to-have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to enhance efficiency.

Professional & Technical Skills:
- Must-have skills: Proficiency in IBM Netezza.
- Good-to-have skills: Experience with data warehousing concepts and practices.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with data modeling and database design principles.
- Experience with performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in IBM Netezza.
- This position is based in Pune.
- 15 years of full-time education is required.
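The ETL process the role description mentions (extract, transform and load) can be illustrated with a minimal, self-contained sketch. Python's sqlite3 stands in purely for illustration; the posting's actual platform is IBM Netezza, and all table names and values here are invented:

```python
import sqlite3

# Minimal ETL sketch: extract rows from a source table, apply a
# transformation, and load the result into a target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (id INTEGER, amount_cents INTEGER)")
conn.executemany("INSERT INTO src_orders VALUES (?, ?)",
                 [(1, 1250), (2, 399), (3, 10000)])

# Extract
rows = conn.execute("SELECT id, amount_cents FROM src_orders").fetchall()

# Transform: convert integer cents to a decimal amount
transformed = [(oid, cents / 100.0) for oid, cents in rows]

# Load into the target table
conn.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO tgt_orders VALUES (?, ?)", transformed)

total = conn.execute("SELECT SUM(amount) FROM tgt_orders").fetchone()[0]
print(round(total, 2))  # 116.49
```

In a warehouse such as Netezza the same three steps would typically run as set-based SQL rather than row-by-row Python.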

Posted 1 week ago

Apply

2.0 years

0 Lacs

Greater Chennai Area

On-site

Linkedin logo

Why CDM Smith? Check out this video and find out why our team loves to work here! Join Us! CDM Smith – where amazing career journeys unfold.

Imagine a place committed to offering an unmatched employee experience. Where you work on projects that are meaningful to you. Where you play an active part in shaping your career journey. Where your co-workers are invested in you and your success. Where you are encouraged and supported to do your very best and given the tools and resources to do so. Where it's a priority that the company takes good care of you and your family.

Our employees are the heart of our company. As an employer of choice, our goal is to provide a challenging, progressive and inclusive work environment which fosters personal leadership, career growth and development for every employee. We value passionate individuals who challenge the norm, deliver world-class solutions and bring diverse perspectives. Join our team, and together we will make a difference and change the world.

Job Description
CDM Smith is seeking a Data Engineer to join our Digital Engineering Solutions team. This individual will be part of the Data Technology group within the Digital Engineering Solutions team, helping to drive strategic Architecture, Engineering and Construction (AEC) initiatives using cutting-edge data technologies and analytics to deliver actionable business insights and robust solutions for AEC professionals and client outcomes. The Data Technology group will lead the firm in AEC-focused Business Intelligence and data services by providing architectural guidance, technological vision, and solution development. The group will specifically utilize advanced analytics, data science, and AI/ML to give our business and our products a competitive advantage. This includes understanding and managing the data, how it interconnects, and architecting and engineering data for self-serve BI and BA opportunities.
This position is for a person who has demonstrated excellence in data engineering capabilities, is experienced with data technology and processes, and enjoys framing a problem, shaping and creating solutions, and helping to lead and champion implementation. As a member of the Digital Engineering Solutions team, the Data Technology group will also engage in research and development and provide guidance and oversight to the AEC practices at CDM Smith, engaging in new product research, testing, and the incubation of data technology-related ideas that arise from around the company.

Key Responsibilities
- Assist in the design, development, and maintenance of scalable data pipelines and workflows to extract, transform, and load (ETL/ELT) data from various sources into target systems.
- Contribute to automating workflows to ensure efficiency, scalability, and error reduction in data integration processes.
- Test data quality and integrity by implementing processes to validate the completeness, accuracy, and consistency of data.
- Monitor and troubleshoot data pipeline performance and reliability.
- Document data engineering processes and workflows.
- Collaborate with Data Scientists, Analytics Engineers, and stakeholders to understand business requirements and deliver high-quality data solutions.
- Stay abreast of the latest developments and advancements, including new and emerging technologies, best practices, and new tools and software applications, and how they could impact CDM Smith.
- Assist with the development of documentation, standards, best practices, and workflows for data technology hardware/software in use across the business.
- Perform other duties as required.

Skills and Abilities
- Good foundation in the Software Development Life Cycle (SDLC) and Agile development methodologies.
- Good foundation in cloud ETL/ELT tools and deployment, including Azure Data Factory, Azure Databricks, and Azure Synapse Analytics.
- Good knowledge of data modeling and designing scalable ETL/ELT processes.
- Familiarity with CI/CD pipelines and DevOps practices for data solutions.
- Knowledge of monitoring tools and techniques for ensuring pipeline observability and reliability.
- Excellent problem-solving and critical-thinking skills to identify and address technical challenges effectively.
- Strong critical-thinking skills to generate innovative solutions and improve business processes.
- Ability to communicate complex technical concepts effectively to both technical and non-technical audiences.
- Detail-oriented, with the ability to assist with executing highly complex or specialized projects.

Minimum Qualifications
- Bachelor's degree.
- 0 – 2 years of related experience.
- Equivalent additional directly related experience will be considered in lieu of a degree.

Amount of Travel Required: 0%

Background Check and Drug Testing Information
CDM Smith Inc. and its divisions and subsidiaries (hereafter collectively referred to as "CDM Smith") reserve the right to require background checks, including criminal, employment, education, licensure, etc., as well as credit and motor vehicle checks when applicable for certain positions. In addition, CDM Smith may conduct drug testing for designated positions. Background checks are conducted after an offer of employment has been made in the United States. The timing of background checks for candidates for positions outside the United States will vary based on country statutory law, but in no case will the background check precede an interview. CDM Smith will conduct interviews of qualified individuals prior to requesting a criminal background check, and no job application submitted prior to such interview shall inquire into an applicant's criminal history. If this position is subject to a background check for any convictions related to its responsibilities and requirements, employment will be contingent upon successful completion of a background investigation including criminal history.
Criminal history will not automatically disqualify a candidate. In addition, during employment, individuals may be required by CDM Smith or a CDM Smith client to successfully complete additional background checks, including motor vehicle record checks, as well as drug testing.

Agency Disclaimer
All vendors must have a signed CDM Smith Placement Agreement from the CDM Smith Recruitment Center Manager to receive payment for a placement. Verbal or written commitments from any other member of the CDM Smith staff will not be considered binding terms. All unsolicited resumes sent to CDM Smith, and any resume submitted to any employee outside of the CDM Smith Recruiting Center Team (RCT), will be considered property of CDM Smith. CDM Smith will not be held liable to pay a placement fee.

Business Unit: COR
Group: COR
Assignment Category: Fulltime-Regular
Employment Type: Regular
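The data-quality responsibilities this posting describes (validating completeness, accuracy, and consistency) can be sketched in a few lines. This is a hedged illustration, not CDM Smith's actual tooling; the record shape, field names, and rules are all invented:

```python
# Toy data-quality checks over a batch of records:
# completeness (no missing keys), accuracy (values in range),
# consistency (referential integrity against a known set).
records = [
    {"id": 1, "amount": 120.0, "customer_id": "C1"},
    {"id": 2, "amount": -5.0,  "customer_id": "C2"},   # fails accuracy
    {"id": 3, "amount": 80.0,  "customer_id": None},   # fails completeness
]
known_customers = {"C1", "C2"}

def validate(rec):
    issues = []
    if rec["customer_id"] is None:
        issues.append("missing customer_id")   # completeness
    if rec["amount"] < 0:
        issues.append("negative amount")       # accuracy
    if rec["customer_id"] is not None and rec["customer_id"] not in known_customers:
        issues.append("unknown customer")      # consistency
    return issues

report = {rec["id"]: validate(rec) for rec in records}
print(report)  # {1: [], 2: ['negative amount'], 3: ['missing customer_id']}
```

In practice the same checks would run inside the pipeline (e.g. as Databricks expectations or ADF validation activities) rather than in ad hoc scripts.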

Posted 1 week ago

Apply

0.0 - 1.0 years

0 Lacs

Makarba, Ahmedabad, Gujarat

On-site

Indeed logo

Keep up with current hiring trends, industry language, and US employment norms. Collaborate with recruiters or candidates to understand job targets, roles, and employer expectations. Create tailored, professional resumes for candidates seeking opportunities in the US job market. Analyze clients' current resumes and job histories to extract relevant information and reframe it effectively. Highlight key accomplishments, skills, and qualifications aligned with specific job descriptions.

Job Types: Full-time, Permanent
Pay: ₹9,632.66 - ₹23,708.88 per month
Benefits: Paid sick time
Schedule: Evening shift, fixed shift, Monday to Friday, night shift, US shift
Experience: resume creating: 1 year (Preferred)
Location: Makarba, Ahmedabad, Gujarat (Preferred)
Work Location: In person

Posted 1 week ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

About Veersa
Veersa Technologies is a US-based IT services and AI enablement company founded in 2020, with a global delivery center in Noida (Sector 142).
- Founded by industry leaders, with an impressive 85% YoY growth
- A profitable company since inception
- Team strength: almost 400 professionals and growing rapidly

Our Services Include:
- Digital & Software Solutions: Product Development, Legacy Modernization, Support
- Data Engineering & AI Analytics: Predictive Analytics, AI/ML Use Cases, Data Visualization
- Tools & Accelerators: AI/ML-embedded tools that integrate with client systems
- Tech Portfolio Assessment: TCO analysis, modernization roadmaps, etc.

Tech Stack
- AI/ML, IoT, Blockchain, MEAN/MERN stack, Python, GoLang, RoR, Java Spring Boot, Node.js
- Databases: PostgreSQL, MySQL, MS SQL, Oracle
- Cloud: AWS & Azure (Serverless Architecture)

Website: https://veersatech.com
LinkedIn: Feel free to explore our company profile

Job Title: ETL Ingestion Engineer (Azure Data Factory)
Department: Data Engineering / Analytics
Employment Type: Full-time
Experience Level: 2–5 years

About The Role
We are looking for a talented ETL Ingestion Engineer with hands-on experience in Azure Data Factory (ADF) to join our Data Engineering team. The individual will be responsible for building, orchestrating, and maintaining robust data ingestion pipelines from various source systems into our data lake and data warehouse environments.

Key Responsibilities
- Design and implement scalable data ingestion pipelines using Azure Data Factory (ADF).
- Extract data from a variety of sources such as SQL Server, flat files, APIs, and cloud storage.
- Develop ADF pipelines and data flows to support both batch and incremental loads.
- Ensure data quality, consistency, and reliability throughout the ETL process.
- Optimize ADF pipelines for performance, cost, and scalability.
- Monitor pipeline execution, troubleshoot failures, and ensure data availability meets SLAs.
- Document pipeline logic, source-target mappings, and operational procedures.

Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2+ years of experience in ETL development and data pipeline implementation.
- Strong hands-on experience with Azure Data Factory (ADF), including linked services, datasets, pipelines, and triggers.
- Proficiency in SQL and working with structured and semi-structured data (CSV, JSON, Parquet).
- Experience with Azure storage systems (ADLS Gen2, Blob Storage) and data movement.
- Familiarity with job monitoring and logging mechanisms in Azure.

Preferred Skills
- Experience with Azure Data Lake, Synapse Analytics, or Databricks.
- Exposure to Azure DevOps for CI/CD in data pipelines.
- Understanding of data governance, lineage, and compliance requirements (GDPR, HIPAA, etc.).
- Knowledge of RESTful APIs and API-based ingestion.

Job Title: ETL Lead – Azure Data Factory (ADF)
Department: Data Engineering / Analytics
Employment Type: Full-time
Experience Level: 5+ years

About The Role
We are seeking an experienced ETL Lead with strong expertise in Azure Data Factory (ADF) to lead and oversee data ingestion and transformation projects across the organization. The role demands a mix of technical proficiency and leadership to design scalable data pipelines, manage a team of engineers, and collaborate with cross-functional stakeholders to ensure reliable data delivery.

Key Responsibilities
- Lead the design, development, and implementation of end-to-end ETL pipelines using Azure Data Factory (ADF).
- Architect scalable ingestion solutions from multiple structured and unstructured sources (e.g., SQL Server, APIs, flat files, cloud storage).
- Define best practices for ADF pipeline orchestration, performance tuning, and cost optimization.
- Mentor, guide, and manage a team of ETL engineers, ensuring high-quality deliverables and adherence to project timelines.
- Work closely with business analysts, data modelers, and source system owners to understand and translate data requirements.
- Establish data quality checks, monitoring frameworks, and alerting mechanisms.
- Drive code reviews, CI/CD integration (using Azure DevOps), and documentation standards.
- Own delivery accountability across multiple ingestion and data integration workstreams.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related discipline.
- 5+ years of hands-on ETL development experience, with 3+ years of experience in Azure Data Factory.
- Deep understanding of data ingestion, transformation, and warehousing best practices.
- Strong SQL skills and experience with cloud-native data storage (ADLS Gen2, Blob Storage).
- Proficiency in orchestrating complex data flows, parameterized pipelines, and incremental data loads.
- Experience in handling large-scale data migration or modernization projects.

Preferred Skills
- Familiarity with modern data platforms like Azure Synapse, Snowflake, and Databricks.
- Exposure to Azure DevOps pipelines for CI/CD of ADF pipelines and linked services.
- Understanding of data governance, security (RBAC), and compliance requirements.
- Experience leading Agile teams and sprint-based delivery models.
- Excellent communication, leadership, and stakeholder management skills.
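One pattern behind the "incremental loads" called out in this posting is the high-watermark copy: remember the latest timestamp successfully loaded, then pull only newer rows on the next run. A hedged, self-contained sketch of the idea; sqlite3 stands in for the real source system, and all table and column names are hypothetical:

```python
import sqlite3

# Source table with a modification timestamp per row.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE events (id INTEGER, updated_at TEXT)")
src.executemany("INSERT INTO events VALUES (?, ?)", [
    (1, "2024-01-01"), (2, "2024-01-02"), (3, "2024-01-03"),
])

watermark = "2024-01-01"  # last updated_at loaded by the previous run

# Extract only rows newer than the stored watermark.
new_rows = src.execute(
    "SELECT id, updated_at FROM events WHERE updated_at > ? ORDER BY updated_at",
    (watermark,),
).fetchall()

# After loading, advance the watermark to the max timestamp just copied,
# so the next run starts where this one ended.
if new_rows:
    watermark = new_rows[-1][1]

print(len(new_rows), watermark)  # 2 2024-01-03
```

In ADF the same pattern is usually implemented with a lookup activity that reads the stored watermark, a parameterized copy activity, and a final activity that persists the new watermark.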

Posted 1 week ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

We are a proud work-from-office company. If you're ready to work on-site in a dynamic, global company, we'd love to hear from you.

Position Summary
Do you have a passion for building data architectures that enable smooth and seamless product experiences? Are you an all-around data enthusiast with a knack for ETL? We're hiring Data Engineers to help build and optimize the foundational architecture of our product's data. We've built a strong data engineering team to date, but have a lot of work ahead of us, including:
- Migrating from relational databases to a streaming and big data architecture, including a complete overhaul of our data feeds
- Defining streaming event data feeds required for real-time analytics and reporting
- Leveling up our platform, including enhancing our automation, test coverage, observability, alerting, and performance

As a Data Engineer, you will work with the development team to construct a data streaming platform and data warehouse that serve as the data foundation for our product. Help us scale our business to meet the needs of our growing customer base and develop new products on our platform. You'll be a critical part of our growing company, working on a cross-functional team to implement best practices in technology, architecture, and process. You'll have the chance to work in an open and collaborative environment, receive hands-on mentorship, and have ample opportunities to grow and accelerate your career!
Responsibilities
- Build our next-generation data warehouse
- Build our event stream platform
- Translate user requirements for reporting and analysis into actionable deliverables
- Enhance automation, operation, and expansion of our real-time and batch data environment
- Manage numerous projects in an ever-changing work environment
- Extract, transform, and load complex data into the data warehouse using cutting-edge technologies
- Build processes for top-notch security, performance, reliability, and accuracy
- Provide mentorship and collaborate with fellow team members

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, Operations Research, or a related field required
- 3+ years of experience building data pipelines
- 3+ years of experience building data frameworks for unit testing, data lineage tracking, and automation
- Fluency in Scala is required
- Working knowledge of Apache Spark
- Familiarity with streaming technologies (e.g., Kafka, Kinesis, Flink)

Nice-to-Haves
- Experience with machine learning
- Familiarity with Looker a plus
- Knowledge of additional server-side programming languages (e.g., Golang, C#, Ruby)

PrismHR is a fast-paced SaaS company that provides customers with a cloud-based payroll processing software application. PrismHR also provides professional services, including system implementation consulting, custom configurations, and training. Lastly, via the Company's Marketplace platform, customers and end users access other human resources and employee benefits applications from PrismHR's Marketplace Partners.

Diversity, Equity and Inclusion Program/Affirmative Action Plan
We have transformed our company into an inclusive environment where individuals are valued for their talents and empowered to reach their fullest potential. At PrismHR, we strive to continually lead with our values and beliefs that enable our employees to develop their potential, bring their full self to work, and engage in a world of inclusion.
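The streaming work this posting describes (event feeds powering real-time analytics) ultimately reduces to grouping events into time windows and aggregating per window. A toy sketch of a tumbling-window count, with invented event data and no Kafka or Flink involved; the real platform would do this continuously over an unbounded stream:

```python
from collections import defaultdict

# Simulated event stream: each event carries an epoch-second timestamp.
events = [
    {"ts": 3,  "user": "a"},
    {"ts": 9,  "user": "b"},
    {"ts": 12, "user": "a"},
    {"ts": 27, "user": "c"},
]

WINDOW = 10  # tumbling-window width in seconds

counts = defaultdict(int)
for ev in events:
    # Key each event by the start of the window it falls into.
    window_start = (ev["ts"] // WINDOW) * WINDOW
    counts[window_start] += 1

print(dict(counts))  # {0: 2, 10: 1, 20: 1}
```

Frameworks such as Flink or Spark Structured Streaming add what this sketch omits: out-of-order handling via watermarks, state checkpointing, and exactly-once delivery.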
Ensuring an inclusive environment for our employees is an integral part of the PrismHR culture. We aren't just checking a box, we are truly committed to creating a workplace that celebrates the diversity of our employees and fosters a sense of belonging for everyone. This is essential to our success. We are dedicated to building a diverse, inclusive, and authentic workplace, so if you’re excited about our roles but your past experience doesn’t align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for these open roles or other open roles. We particularly encourage applicants from traditionally under-represented groups as we seek to increase the diversity of our workforce and provide fair opportunities for all. As a proud Equal Opportunity and Affirmative Action Employer, PrismHR encourages talent from all backgrounds to join our team. Employment decisions are based on an individual’s qualifications as they relate to the job under consideration. The Company’s policy prohibits unlawful discrimination based on sex (which includes pregnancy, childbirth, breastfeeding, or related medical conditions, the actual sex of the individual, or the gender identity or gender expression), race, color, religion, including religious dress practices and religious grooming practices, sexual orientation, national origin, ancestry, citizenship, marital status, familial status, age, physical disability, mental disability, medical condition, genetic information, protected veteran or military status, or any other consideration made unlawful by federal, state or local laws, ordinances, or regulations. The Company is committed to complying with all applicable laws providing equal employment opportunities. This commitment applies to all persons involved in the operations of the Company and prohibits unlawful discrimination by any employee of the Company, including supervisors and co-workers. 
Privacy Policy: For information about how we collect and use your personal information, please see our privacy statement available at https://www.prismhr.com/about/privacy-policy. PrismHR provides reasonable accommodation for qualified individuals with disabilities and disabled veterans in job application procedures. If you have any difficulty using our online system and you need a reasonable accommodation due to a disability, you may use the following alternative email address to contact us about your interest in employment at PrismHR: taglobal@prismhr.com. Please indicate in the subject line of your email that you are requesting accommodation. Only candidates being considered for a position who require an accommodation will receive a follow-up response.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Delhi, India

On-site

Linkedin logo

About Veersa
Veersa Technologies is a US-based IT services and AI enablement company founded in 2020, with a global delivery center in Noida (Sector 142).
- Founded by industry leaders, with an impressive 85% YoY growth
- A profitable company since inception
- Team strength: almost 400 professionals and growing rapidly

Our Services Include:
- Digital & Software Solutions: Product Development, Legacy Modernization, Support
- Data Engineering & AI Analytics: Predictive Analytics, AI/ML Use Cases, Data Visualization
- Tools & Accelerators: AI/ML-embedded tools that integrate with client systems
- Tech Portfolio Assessment: TCO analysis, modernization roadmaps, etc.

Tech Stack
- AI/ML, IoT, Blockchain, MEAN/MERN stack, Python, GoLang, RoR, Java Spring Boot, Node.js
- Databases: PostgreSQL, MySQL, MS SQL, Oracle
- Cloud: AWS & Azure (Serverless Architecture)

Website: https://veersatech.com
LinkedIn: Feel free to explore our company profile

Job Title: ETL Ingestion Engineer (Azure Data Factory)
Department: Data Engineering / Analytics
Employment Type: Full-time
Experience Level: 2–5 years

About The Role
We are looking for a talented ETL Ingestion Engineer with hands-on experience in Azure Data Factory (ADF) to join our Data Engineering team. The individual will be responsible for building, orchestrating, and maintaining robust data ingestion pipelines from various source systems into our data lake and data warehouse environments.

Key Responsibilities
- Design and implement scalable data ingestion pipelines using Azure Data Factory (ADF).
- Extract data from a variety of sources such as SQL Server, flat files, APIs, and cloud storage.
- Develop ADF pipelines and data flows to support both batch and incremental loads.
- Ensure data quality, consistency, and reliability throughout the ETL process.
- Optimize ADF pipelines for performance, cost, and scalability.
- Monitor pipeline execution, troubleshoot failures, and ensure data availability meets SLAs.
- Document pipeline logic, source-target mappings, and operational procedures.

Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2+ years of experience in ETL development and data pipeline implementation.
- Strong hands-on experience with Azure Data Factory (ADF), including linked services, datasets, pipelines, and triggers.
- Proficiency in SQL and working with structured and semi-structured data (CSV, JSON, Parquet).
- Experience with Azure storage systems (ADLS Gen2, Blob Storage) and data movement.
- Familiarity with job monitoring and logging mechanisms in Azure.

Preferred Skills
- Experience with Azure Data Lake, Synapse Analytics, or Databricks.
- Exposure to Azure DevOps for CI/CD in data pipelines.
- Understanding of data governance, lineage, and compliance requirements (GDPR, HIPAA, etc.).
- Knowledge of RESTful APIs and API-based ingestion.

Job Title: ETL Lead – Azure Data Factory (ADF)
Department: Data Engineering / Analytics
Employment Type: Full-time
Experience Level: 5+ years

About The Role
We are seeking an experienced ETL Lead with strong expertise in Azure Data Factory (ADF) to lead and oversee data ingestion and transformation projects across the organization. The role demands a mix of technical proficiency and leadership to design scalable data pipelines, manage a team of engineers, and collaborate with cross-functional stakeholders to ensure reliable data delivery.

Key Responsibilities
- Lead the design, development, and implementation of end-to-end ETL pipelines using Azure Data Factory (ADF).
- Architect scalable ingestion solutions from multiple structured and unstructured sources (e.g., SQL Server, APIs, flat files, cloud storage).
- Define best practices for ADF pipeline orchestration, performance tuning, and cost optimization.
- Mentor, guide, and manage a team of ETL engineers, ensuring high-quality deliverables and adherence to project timelines.
- Work closely with business analysts, data modelers, and source system owners to understand and translate data requirements.
- Establish data quality checks, monitoring frameworks, and alerting mechanisms.
- Drive code reviews, CI/CD integration (using Azure DevOps), and documentation standards.
- Own delivery accountability across multiple ingestion and data integration workstreams.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related discipline.
- 5+ years of hands-on ETL development experience, with 3+ years of experience in Azure Data Factory.
- Deep understanding of data ingestion, transformation, and warehousing best practices.
- Strong SQL skills and experience with cloud-native data storage (ADLS Gen2, Blob Storage).
- Proficiency in orchestrating complex data flows, parameterized pipelines, and incremental data loads.
- Experience in handling large-scale data migration or modernization projects.

Preferred Skills
- Familiarity with modern data platforms like Azure Synapse, Snowflake, and Databricks.
- Exposure to Azure DevOps pipelines for CI/CD of ADF pipelines and linked services.
- Understanding of data governance, security (RBAC), and compliance requirements.
- Experience leading Agile teams and sprint-based delivery models.
- Excellent communication, leadership, and stakeholder management skills.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Company Overview
Viraaj HR Solutions is a leading staffing firm dedicated to connecting talented individuals with exceptional organizations. We focus on delivering innovative and effective workforce solutions tailored to meet the unique needs of our clients. As a company, we prioritize collaboration, integrity, and excellence, ensuring that we create the best opportunities for both our candidates and clients.

Job Title: GCP Data Engineer
Location: India (On-site)

Role Responsibilities
- Design, develop, and maintain robust data pipelines on Google Cloud Platform (GCP).
- Implement ETL processes to extract data from various sources.
- Optimize data storage solutions and ensure proper data architecture.
- Collaborate with data scientists and analysts to provide insights through analytics.
- Monitor and troubleshoot data issues for continuous improvement.
- Ensure data quality and governance throughout the data lifecycle.
- Participate in the architecture and schema design of data lakes and warehouses.
- Create and manage BigQuery datasets, tables, and views.
- Automate repetitive data tasks to streamline workflows.
- Document data processes and protocols for future reference.
- Conduct regular performance tuning of data pipelines and storage.
- Participate in code reviews and provide support to junior team members.
- Stay updated on the latest GCP features and industry trends.
- Work closely with stakeholders to understand data requirements.
- Provide technical support and guidance on data-related projects.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum 3 years of experience in a data engineering role.
- Strong understanding of Google Cloud Platform services.
- Experience with SQL and NoSQL databases.
- Proficiency in programming languages such as Python.
- Familiarity with data modeling techniques.
- Hands-on experience with ETL tools and frameworks.
- Strong analytical and problem-solving skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Excellent verbal and written communication skills.
- Experience with data visualization tools is a plus.
- Knowledge of data governance frameworks.
- Understanding of agile methodologies.
- Ability to work collaboratively in a team environment.
- Willingness to continuously learn and adapt to new technologies.
- Certification in GCP or relevant technologies is preferred.

Skills: NoSQL databases, cloud storage, data modeling, SQL, GCP, data architecture, Google Cloud Platform (GCP), data visualization tools, ETL processes, data governance, BigQuery, Python, PySpark, agile methodologies
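Creating and managing BigQuery tables, as the responsibilities above mention, is often scripted as SQL DDL. A small sketch of a helper that emits standard BigQuery CREATE TABLE DDL with date partitioning; the project, dataset, table name, and schema here are all invented for illustration:

```python
# Hypothetical helper that builds BigQuery DDL strings. In a real
# pipeline the resulting SQL would be submitted via the BigQuery client
# or bq CLI; here we only construct and inspect the statement.
def partitioned_table_ddl(table, columns, partition_col):
    cols = ", ".join(f"{name} {typ}" for name, typ in columns)
    return (
        f"CREATE TABLE IF NOT EXISTS `{table}` ({cols}) "
        f"PARTITION BY DATE({partition_col})"
    )

ddl = partitioned_table_ddl(
    "my_project.analytics.events",          # invented project.dataset.table
    [("event_id", "STRING"),
     ("event_ts", "TIMESTAMP"),
     ("payload", "JSON")],
    "event_ts",                             # partition on the event timestamp
)
print(ddl)
```

Partitioning on a TIMESTAMP column with `PARTITION BY DATE(col)` is the usual way to keep scans (and cost) bounded to the dates a query actually touches.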

Posted 1 week ago

Apply

2.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Job Description

About us:
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from diverse backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values diverse backgrounds. We believe your unique perspective is important, and you'll build relationships by being authentic and respectful. At Target, inclusion is part of the core value. We aim to create equitable experiences for all, regardless of their dimensions of difference. As an equal opportunity employer, Target provides diverse opportunities for everyone to grow and win.

Fueling the continued success of one of the world's most beloved and recognized brands is a distinctly capable, creative and innovative Marketing organization well known for inspiring and surprising guests, and we pride ourselves on connecting them to the products and experiences they expect and deserve from Target. Roundel, Target's retail media network, offers the world's leading advertisers industry-leading digital advertising solutions. More than ¾ of American adults shop Target, which translates into incredible scale for advertisers to connect directly with our guests and deliver best-in-class marketing outcomes!
We design and deliver impactful marketing for brands and their agencies, resulting in engagement.

Pyramid Overview
At its core, Roundel is about using Target's rich insights to create smart, personalized advertising campaigns that bring guests more of the products and offers they love. That moment when guests are prepping for their Target run and see exactly the right online offer at just the right time? That's Roundel. It works on Target's platforms, like Target.com and our mobile app, as well as going beyond to connect our partners with guests across more than 150 premium publishers and channels (think: Pinterest, PopSugar and NBC Universal). We work with some of the largest brands and advertising agencies in the world to create a unique experience for our guests' digital journey.

As an Analyst, Performance and Insights, you will be responsible for driving cutting-edge analysis for vendors running advertising campaigns with Roundel (Target's media network). The core competency of this role is to handle performance analysis for multiple clients, identify growth opportunities, triangulate data from various sources to drive campaign performance toward marketing objectives, and use data storytelling to influence the vendor's media strategy and investment decisions.
Responsibilities
- Provide mid-flight, ad hoc, and end-of-campaign reporting for digital campaigns; consider past campaign performance, similar campaign objectives, and category benchmarks
- Combine the individual vendor recaps to generate quarterly/annual recaps and create category-level insights
- Identify the key metrics and combine them with observations to translate into strategy/vendor insights, adding value to the overall plan
- Observe and evaluate trends of media campaigns and provide recommendations for optimization tactics and future plans to drive effectiveness
- Stay updated with the overall trend and guest behavior in the retail industry, and be able to relate the results to derive market-level insights
- Ensure data accuracy, as well as reporting output quality control; troubleshoot and identify root causes for data inaccuracy (manual vs. system errors)
- Identify, select, and extract relevant data from various internal and external sources
- Independently work raw data sets into information fit for analysis
- Proactively recommend innovative ideas and opportunities

About You
- Minimum 2-7 years of experience in media or a related domain
- Strong communication skills and desire to work in cross-functional groups; strong writing and presentation skills to engage and influence audiences/client decisions
- Ability to comprehend advertising metrics (i.e., understand the true value of ROI, the impact of results, and how actual results compare to benchmarks) and draw inferences to build forward-looking recommendations
- Exceptional attention to detail, organizational and analytical skills
- Ability to multi-task and work within a rapidly changing environment
- Continuous drive to improve performance by deriving actionable insight from datasets
- Exceptional understanding of the digital measurement space and analytics tools/pixels
- Knowledge of ad serving, ad networks, and the advertising/media landscape required
  • Familiarity with the reporting dimensions and metrics of various ad/reporting servers (DFP, DCM, Facebook Ads Manager, Criteo, etc.), along with expertise in Microsoft Excel, Domo, Tableau and Adobe Site Catalyst

Useful Links
  • Life at Target: https://india.target.com/
  • Benefits: https://india.target.com/life-at-target/workplace/benefits

Posted 1 week ago

Apply

3.0 years

0 Lacs

Delhi, India

On-site


Company Overview
Viraaj HR Solutions is a leading staffing firm dedicated to connecting talented individuals with exceptional organizations. We focus on delivering innovative and effective workforce solutions tailored to meet the unique needs of our clients. As a company, we prioritize collaboration, integrity, and excellence, ensuring that we create the best opportunities for both our candidates and clients.

Job Title: GCP Data Engineer
Location: India (On-site)

Role Responsibilities
  • Design, develop, and maintain robust data pipelines on Google Cloud Platform (GCP)
  • Implement ETL processes to extract data from various sources
  • Optimize data storage solutions and ensure proper data architecture
  • Collaborate with data scientists and analysts to provide insights through analytics
  • Monitor and troubleshoot data issues for continuous improvement
  • Ensure data quality and governance throughout the data lifecycle
  • Participate in the architecture and schema design of data lakes and warehouses
  • Create and manage BigQuery datasets, tables, and views
  • Automate repetitive data tasks to streamline workflows
  • Document data processes and protocols for future reference
  • Conduct regular performance tuning of data pipelines and storage
  • Participate in code reviews and provide support to junior team members
  • Stay updated on the latest GCP features and industry trends
  • Work closely with stakeholders to understand data requirements
  • Provide technical support and guidance on data-related projects

Qualifications
  • Bachelor's degree in Computer Science, Information Technology, or a related field
  • Minimum 3 years of experience in a data engineering role
  • Strong understanding of Google Cloud Platform services
  • Experience with SQL and NoSQL databases
  • Proficiency in programming languages such as Python
  • Familiarity with data modeling techniques
  • Hands-on experience with ETL tools and frameworks
  • Strong analytical and problem-solving skills
  • Ability to work in a fast-paced environment and meet deadlines
  • Excellent verbal and written communication skills
  • Experience with data visualization tools is a plus
  • Knowledge of data governance frameworks
  • Understanding of agile methodologies
  • Ability to work collaboratively in a team environment
  • Willingness to continuously learn and adapt to new technologies
  • Certification in GCP or relevant technologies is preferred

Skills: NoSQL databases, cloud storage, data modeling, SQL, data architecture, Google Cloud Platform (GCP), data visualization tools, ETL processes, data governance, BigQuery, Python, PySpark, agile methodologies
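The "implement ETL processes" responsibility above can be sketched concretely. The following is a minimal, illustrative extract-transform-load pipeline in plain Python, with the stdlib `csv` module standing in for a real source and `sqlite3` standing in for a warehouse such as BigQuery; the table, field names and sample data are all invented:

```python
import csv
import io
import sqlite3

# Hypothetical source: CSV order data, as might arrive from an upstream system.
RAW_CSV = """order_id,amount,region
1,100.50,north
2,,south
3,75.25,north
"""

def extract(text):
    """Extract: parse CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts, cast types."""
    cleaned = []
    for row in rows:
        if not row["amount"]:
            continue  # data-quality rule: skip incomplete records
        cleaned.append((int(row["order_id"]), float(row["amount"]), row["region"]))
    return cleaned

def load(rows, conn):
    """Load: write cleaned rows into a warehouse-style table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # (2, 175.75)
```

A production pipeline on GCP would replace the in-memory database with BigQuery and add scheduling, monitoring and governance, but the extract/transform/load separation stays the same.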

Posted 1 week ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Company Overview
Viraaj HR Solutions is a leading staffing firm dedicated to connecting talented individuals with exceptional organizations. We focus on delivering innovative and effective workforce solutions tailored to meet the unique needs of our clients. As a company, we prioritize collaboration, integrity, and excellence, ensuring that we create the best opportunities for both our candidates and clients.

Job Title: GCP Data Engineer
Location: India (On-site)

Role Responsibilities
  • Design, develop, and maintain robust data pipelines on Google Cloud Platform (GCP)
  • Implement ETL processes to extract data from various sources
  • Optimize data storage solutions and ensure proper data architecture
  • Collaborate with data scientists and analysts to provide insights through analytics
  • Monitor and troubleshoot data issues for continuous improvement
  • Ensure data quality and governance throughout the data lifecycle
  • Participate in the architecture and schema design of data lakes and warehouses
  • Create and manage BigQuery datasets, tables, and views
  • Automate repetitive data tasks to streamline workflows
  • Document data processes and protocols for future reference
  • Conduct regular performance tuning of data pipelines and storage
  • Participate in code reviews and provide support to junior team members
  • Stay updated on the latest GCP features and industry trends
  • Work closely with stakeholders to understand data requirements
  • Provide technical support and guidance on data-related projects

Qualifications
  • Bachelor's degree in Computer Science, Information Technology, or a related field
  • Minimum 3 years of experience in a data engineering role
  • Strong understanding of Google Cloud Platform services
  • Experience with SQL and NoSQL databases
  • Proficiency in programming languages such as Python
  • Familiarity with data modeling techniques
  • Hands-on experience with ETL tools and frameworks
  • Strong analytical and problem-solving skills
  • Ability to work in a fast-paced environment and meet deadlines
  • Excellent verbal and written communication skills
  • Experience with data visualization tools is a plus
  • Knowledge of data governance frameworks
  • Understanding of agile methodologies
  • Ability to work collaboratively in a team environment
  • Willingness to continuously learn and adapt to new technologies
  • Certification in GCP or relevant technologies is preferred

Skills: NoSQL databases, cloud storage, data modeling, SQL, data architecture, Google Cloud Platform (GCP), data visualization tools, ETL processes, data governance, BigQuery, Python, PySpark, agile methodologies

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Before you apply to a job, select your language preference from the options available at the top right of this page. Discover your next opportunity within an organization that ranks among the world's 500 largest companies. Envision innovative opportunities, experience our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow.

Job Summary
The UPS Enterprise Data Analytics team is looking for a talented and motivated Data Scientist to use statistical modelling and state-of-the-art AI tools and techniques to solve complex, large-scale business problems for UPS operations. This role also supports debugging and enhancing existing AI applications in close collaboration with the Machine Learning Operations team. The position works with multiple stakeholders across different levels of the organization to understand the business problem and to develop and help implement robust, scalable solutions. You will be in a high-visibility position with the opportunity to interact with senior leadership to bring innovation to the operational space at UPS. Success in this role requires excellent communication, to present your cutting-edge solutions to both technical and business leadership.

Responsibilities
  • Become a subject matter expert on UPS business processes and data to help define and solve business needs using data, advanced statistical methods and AI
  • Be actively involved in understanding and converting business use cases into technical requirements for modelling
  • Query, analyze and extract insights from large-scale structured and unstructured data from different sources, using platforms, methods and tools like BigQuery, Google Cloud Storage, etc.
  • Understand and apply appropriate methods for cleaning and transforming data, and engineer relevant features for modelling
  • Actively drive the modelling of business problems into ML/AI models; work closely with stakeholders on model evaluation and acceptance
  • Work closely with the MLOps team to productionize new models, support enhancements, and resolve issues in existing production AI applications
  • Prepare extensive technical documentation, dashboards and presentations for technical and business stakeholders, including leadership teams

Qualifications
  • Expertise in Python and SQL; experienced with data science packages like scikit-learn, numpy, pandas, tensorflow, keras, statsmodels, etc.
  • Strong understanding of statistical concepts and methods (hypothesis testing, descriptive statistics, etc.) and machine learning techniques for regression, classification and clustering problems, including neural networks and deep learning
  • Proficient with GCP tools like Vertex AI, BigQuery, GCS, etc. for model development and other activities across the ML lifecycle
  • Strong ownership and collaborative qualities; takes initiative to identify and drive opportunities for improvement and process streamlining
  • Solid oral and written communication skills, especially around analytical concepts and methods; ability to frame data as a story to convey data-driven results to technical and non-technical audiences
  • Master's degree in a quantitative field (mathematics, computer science, physics, economics, engineering, statistics, operations research, quantitative social science, etc.), an international equivalent, or equivalent job experience
Bonus Qualifications
  • NLP, Gen AI, LLM knowledge/experience
  • Knowledge of Operations Research methodologies and experience with packages like CPLEX, PuLP, etc.
  • Knowledge and experience with MLOps principles and tools in GCP
  • Experience working in an Agile environment; understanding of Lean Agile principles

Contract Type: Permanent (CDI)

At UPS, equal opportunity, fair treatment and an inclusive work environment are key values to which we are committed.

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Before you apply to a job, select your language preference from the options available at the top right of this page. Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill and passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description

Job Summary:
The UPS Enterprise Data Analytics team is looking for a talented and motivated Data Scientist to use statistical modelling and state-of-the-art AI tools and techniques to solve complex, large-scale business problems for UPS operations. This role also supports debugging and enhancing existing AI applications in close collaboration with the Machine Learning Operations team. The position works with multiple stakeholders across different levels of the organization to understand the business problem and to develop and help implement robust, scalable solutions. You will be in a high-visibility position with the opportunity to interact with senior leadership to bring innovation to the operational space at UPS. Success in this role requires excellent communication, to present your cutting-edge solutions to both technical and business leadership.

Responsibilities
  • Become a subject matter expert on UPS business processes and data to help define and solve business needs using data, advanced statistical methods and AI
  • Be actively involved in understanding and converting business use cases into technical requirements for modelling
  • Query, analyze and extract insights from large-scale structured and unstructured data from different sources, using platforms, methods and tools like BigQuery, Google Cloud Storage, etc.
  • Understand and apply appropriate methods for cleaning and transforming data, and engineer relevant features for modelling
  • Actively drive the modelling of business problems into ML/AI models; work closely with stakeholders on model evaluation and acceptance
  • Work closely with the MLOps team to productionize new models, support enhancements, and resolve issues in existing production AI applications
  • Prepare extensive technical documentation, dashboards and presentations for technical and business stakeholders, including leadership teams

Qualifications
  • Expertise in Python and SQL; experienced with data science packages like scikit-learn, numpy, pandas, tensorflow, keras, statsmodels, etc.
  • Strong understanding of statistical concepts and methods (hypothesis testing, descriptive statistics, etc.) and machine learning techniques for regression, classification and clustering problems, including neural networks and deep learning
  • Proficient with GCP tools like Vertex AI, BigQuery, GCS, etc. for model development and other activities across the ML lifecycle
  • Strong ownership and collaborative qualities; takes initiative to identify and drive opportunities for improvement and process streamlining
  • Solid oral and written communication skills, especially around analytical concepts and methods; ability to frame data as a story to convey data-driven results to technical and non-technical audiences
  • Master's degree in a quantitative field (mathematics, computer science, physics, economics, engineering, statistics, operations research, quantitative social science, etc.), an international equivalent, or equivalent job experience

Bonus Qualifications
  • NLP, Gen AI, LLM knowledge/experience
  • Knowledge of Operations Research methodologies and experience with packages like CPLEX, PuLP, etc.
  • Knowledge and experience with MLOps principles and tools in GCP
  • Experience working in an Agile environment; understanding of Lean Agile principles

Employee Type: Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
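Hypothesis testing, one of the statistical methods named in the qualifications, can be illustrated with a short stdlib-only sketch: a two-sided one-sample z-test. The sample data, null-hypothesis mean and known standard deviation below are entirely made up for the example:

```python
import math
from statistics import NormalDist

# Hypothetical sample: observed package scan times in seconds; H0: mean == 30.
sample = [31.2, 29.8, 30.5, 32.1, 28.9, 31.7, 30.9, 29.5]
mu0 = 30.0
sigma = 1.2  # population standard deviation, assumed known for a z-test

n = len(sample)
mean = sum(sample) / n
# Standardize the sample mean under H0.
z = (mean - mu0) / (sigma / math.sqrt(n))
# Two-sided p-value from the standard normal CDF.
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
print(round(z, 3), round(p_value, 3))
```

Here the p-value exceeds 0.05, so at that significance level the sketch would fail to reject the null hypothesis; a t-test (unknown sigma) would follow the same shape with the t distribution instead.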

Posted 1 week ago

Apply

3.0 years

0 Lacs

Andaman and Nicobar Islands, India

Remote


Rockwell Automation is a global technology leader focused on helping the world's manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers, amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility, our people are energized problem solvers who take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that's you, we would love to have you join us!

Job Description
Position Title: Contract Admin
Location: Noida, India

Role Summary
Process standard and/or business system transactions from quote through customer order entry to closure. Identify opportunities for improvement while following established global processes and procedures to maximize process efficiency.

Key Responsibilities
  • Execute processes in compliance with established SOPs and guidelines
  • Perform a wide variety of billing administration duties for assigned team(s) globally
  • Work with contract admins, field service engineers and regional SPOCs to ensure compliance in the service contracts business
  • Follow established procedures on routine work; instructions are required only on new assignments
  • Exposure to audits and service contracts
  • Receive assignments in the form of objectives, with goals and the process by which to meet them
  • Maintain the confidentiality of sensitive and proprietary technical, financial, and commercial information
  • Prior experience with and knowledge of SAP and IFS
  • Knowledge of the commercial aspects of proposals, procurement, contracts, and closures
  • Apply acquired job skills and company policies and procedures to complete assigned tasks
  • Extract data and publish the reports required by the various regional stakeholders per the established cadence
  • Contribute to root cause analysis for any deviation highlighted by regional stakeholders, document those deviations, and respond to audit processes as and when required
  • Use the official tools available, such as conference calls and email, for frequent interaction with peers, customers and regional stakeholders on processes, contracts, feedback, presentations and other updates
  • Interact with vendor regional SPOCs and contracts team members to ensure project progress meets customer requirements
  • Proficiency in MS Office and the Quickbase application

Education & Experience
The Essentials - You Will Have:
  • Bachelor's degree in Science, Commerce, Business Administration or equivalent
  • 3+ years of experience in business operations
  • Exposure to a global work style and engagement with clients from remote locations
  • Willingness to work flexible business shifts, including NA/LA time zones, to connect with business stakeholders
  • Lean Six Sigma certifications are a plus

The Preferred - You Might Also Have:
  • Act as a point of contact for acknowledging and addressing internal customer queries related to projects and contracts
  • Participate in and actively contribute to continuous improvement initiatives, and report/document enhancements to improve productivity

Interpersonal
  • Regularly interact with project/contract administrators, regional SPOCs, and other business unit stakeholders to build productive internal and external working relationships
  • Strong passion for delivering an excellent customer experience
  • Excellent communicator at all levels (in person, written, telephone) with a strong ability to clearly articulate and convey understanding to peers and customers
Reports to: Team Lead

Keywords
SAP, Supply Chain Management, Project Tracking, Service Management, Order Management, Order Processing, Audits, Invoice Processing, Material Tracking, Customer Relationship Management, Microsoft Office Tools

What We Offer
Our benefits package includes:
  • Comprehensive mindfulness programs with a premium membership to Calm
  • Volunteer paid time off, available after 6 months of employment for eligible employees
  • Company volunteer and donation matching program: your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation
  • Employee Assistance Program
  • Personalized wellbeing programs through our OnTrack program
  • On-demand digital course library for professional development, and other local benefits!

At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. Under Rockwell Automation's hybrid policy, employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


About the Job
HERE Technologies is a location data and technology platform company. We empower our customers to achieve better outcomes, from helping a city manage its infrastructure or a business optimize its assets to guiding drivers to their destination safely. At HERE we take it upon ourselves to be the change we wish to see. We create solutions that fuel innovation, provide opportunity and foster inclusion to improve people's lives. If you are inspired by an open world and driven to create positive change, join us. Learn more about us on our YouTube channel.

In this position you will be part of HERE's Places Ingestion team, which is responsible for discovering Points of Interest (Places) by processing large volumes of raw data from a variety of sources to improve content coverage, accuracy, and freshness. You will join an energetic and dedicated team working on challenging tasks in distributed processing of large data and streaming technologies. Beyond the technical challenges, this position offers every opportunity to expand your career both technically and personally.

What's the role:
  • Help design and build the next iteration of processes to improve the quality of Place attributes using machine learning
  • Maintain up-to-date knowledge of research activities in the general fields of machine learning and LLMs
  • Use machine learning algorithms/LLMs to generate translations/transliterations and standardization/derivation rules, and to extract place attributes such as name, address, category and hours of operation from websites using web scraping solutions
  • Participate in both algorithm and software development as part of a scrum team, and contribute artifacts (software, white papers, datasets) for project reviews and demos
  • Collaborate with internal and external team members (researchers and engineers) to expertly implement new product features or enhance existing ones, covering end-to-end aspects like developing, testing, and deploying

Who are you?
You are determined and have the following to be successful in the role:
  • MS or PhD in a discipline such as Statistics, Applied Mathematics, Computer Science or Data Science, or another field with an emphasis or thesis work in one or more of the following areas: statistics/science/engineering, data analysis, machine learning, LLMs
  • 3+ years of experience in the data science field
  • Proficiency with at least one deep learning framework such as TensorFlow, Keras or PyTorch
  • Programming experience with Python and shell scripting
  • Applied statistics or experimentation (e.g., A/B testing, root cause analysis)
  • Unsupervised machine learning methods (e.g., clustering, Bayesian methods)

HERE is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, age, gender identity, sexual orientation, marital status, parental status, religion, sex, national origin, disability, veteran status, and other legally protected characteristics.
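As a concrete illustration of the attribute-extraction responsibility described in this posting, here is a hedged sketch using only the stdlib `html.parser`. The HTML fragment and class names are invented for the example, and real pages need far more robust handling (or an ML/LLM-based extractor, as the posting describes):

```python
from html.parser import HTMLParser

# Hypothetical page fragment for a place listing; real pages vary widely.
HTML = """
<div class="place">
  <h1 class="name">Cafe Aurora</h1>
  <span class="hours">Mon-Fri 8:00-18:00</span>
  <span class="category">Coffee Shop</span>
</div>
"""

class PlaceParser(HTMLParser):
    """Collects the text inside elements whose class marks a place attribute."""
    FIELDS = {"name", "hours", "category"}

    def __init__(self):
        super().__init__()
        self.current = None   # attribute we are currently inside, if any
        self.place = {}       # extracted attribute -> value

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if cls in self.FIELDS:
            self.current = cls

    def handle_data(self, data):
        if self.current and data.strip():
            self.place[self.current] = data.strip()
            self.current = None

parser = PlaceParser()
parser.feed(HTML)
print(parser.place)
```

Rule-based extraction like this only works while the markup stays stable, which is exactly why the team pairs scraping with learned extraction models.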

Posted 1 week ago

Apply

3.0 years

0 Lacs

Delhi, India

Remote


Rockwell Automation is a global technology leader focused on helping the world's manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers, amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility, our people are energized problem solvers who take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that's you, we would love to have you join us!

Job Description
Position Title: Contract Admin
Location: Noida, India

Role Summary
Process standard and/or business system transactions from quote through customer order entry to closure. Identify opportunities for improvement while following established global processes and procedures to maximize process efficiency.

Key Responsibilities
  • Execute processes in compliance with established SOPs and guidelines
  • Perform a wide variety of billing administration duties for assigned team(s) globally
  • Work with contract admins, field service engineers and regional SPOCs to ensure compliance in the service contracts business
  • Follow established procedures on routine work; instructions are required only on new assignments
  • Exposure to audits and service contracts
  • Receive assignments in the form of objectives, with goals and the process by which to meet them
  • Maintain the confidentiality of sensitive and proprietary technical, financial, and commercial information
  • Prior experience with and knowledge of SAP and IFS
  • Knowledge of the commercial aspects of proposals, procurement, contracts, and closures
  • Apply acquired job skills and company policies and procedures to complete assigned tasks
  • Extract data and publish the reports required by the various regional stakeholders per the established cadence
  • Contribute to root cause analysis for any deviation highlighted by regional stakeholders, document those deviations, and respond to audit processes as and when required
  • Use the official tools available, such as conference calls and email, for frequent interaction with peers, customers and regional stakeholders on processes, contracts, feedback, presentations and other updates
  • Interact with vendor regional SPOCs and contracts team members to ensure project progress meets customer requirements
  • Proficiency in MS Office and the Quickbase application

Education & Experience
The Essentials - You Will Have:
  • Bachelor's degree in Science, Commerce, Business Administration or equivalent
  • 3+ years of experience in business operations
  • Exposure to a global work style and engagement with clients from remote locations
  • Willingness to work flexible business shifts, including NA/LA time zones, to connect with business stakeholders
  • Lean Six Sigma certifications are a plus

The Preferred - You Might Also Have:
  • Act as a point of contact for acknowledging and addressing internal customer queries related to projects and contracts
  • Participate in and actively contribute to continuous improvement initiatives, and report/document enhancements to improve productivity

Interpersonal
  • Regularly interact with project/contract administrators, regional SPOCs, and other business unit stakeholders to build productive internal and external working relationships
  • Strong passion for delivering an excellent customer experience
  • Excellent communicator at all levels (in person, written, telephone) with a strong ability to clearly articulate and convey understanding to peers and customers
Reports to: Team Lead

Keywords
SAP, Supply Chain Management, Project Tracking, Service Management, Order Management, Order Processing, Audits, Invoice Processing, Material Tracking, Customer Relationship Management, Microsoft Office Tools

What We Offer
Our benefits package includes:
  • Comprehensive mindfulness programs with a premium membership to Calm
  • Volunteer paid time off, available after 6 months of employment for eligible employees
  • Company volunteer and donation matching program: your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation
  • Employee Assistance Program
  • Personalized wellbeing programs through our OnTrack program
  • On-demand digital course library for professional development, and other local benefits!

At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. Under Rockwell Automation's hybrid policy, employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Company Overview
Viraaj HR Solutions is a leading recruitment firm in India, dedicated to connecting top talent with industry-leading companies. We focus on understanding the unique needs of each client, providing tailored HR solutions that enhance their workforce capabilities. Our mission is to empower organizations by bridging the gap between talent and opportunity. We value integrity, collaboration, and excellence in service delivery, ensuring a seamless experience for both candidates and employers.

Job Title: PySpark Data Engineer
Work Mode: On-Site
Location: India

Role Responsibilities
  • Design, develop, and maintain data pipelines using PySpark
  • Collaborate with data scientists and analysts to gather data requirements
  • Optimize data processing workflows for efficiency and performance
  • Implement ETL processes to integrate data from various sources
  • Create and maintain data models that support analytical reporting
  • Ensure data quality and accuracy through rigorous testing and validation
  • Monitor and troubleshoot production data pipelines to resolve issues
  • Work with SQL databases to extract and manipulate data as needed
  • Utilize cloud technologies for data storage and processing solutions
  • Participate in code reviews and provide constructive feedback
  • Document technical specifications and processes clearly for team reference
  • Stay updated with industry trends and emerging technologies in big data
  • Collaborate with cross-functional teams to deliver data solutions
  • Support data governance initiatives to ensure compliance
  • Provide training and mentorship to junior data engineers

Qualifications
  • Bachelor's degree in Computer Science, Information Technology, or a related field
  • Proven experience as a data engineer, preferably with PySpark
  • Strong understanding of data warehousing concepts and architecture
  • Hands-on experience with ETL tools and frameworks
  • Proficiency in SQL and NoSQL databases
  • Familiarity with cloud platforms like AWS, Azure, or Google Cloud
  • Experience with Python programming for data manipulation
  • Knowledge of data modeling techniques and best practices
  • Ability to work in a fast-paced environment and juggle multiple tasks
  • Excellent problem-solving skills and attention to detail
  • Strong communication and interpersonal skills
  • Ability to work independently and as part of a team
  • Experience in Agile methodologies and practices
  • Knowledge of data governance and compliance standards
  • Familiarity with BI tools such as Tableau or Power BI is a plus

Skills: data modeling, Python programming, PySpark, BI tools, SQL, cloud technologies, NoSQL databases, ETL processes, data warehousing, Agile methodologies, cloud computing
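The posting's emphasis on ensuring data quality "through rigorous testing and validation" can be sketched as a small rule-based validator. This is plain Python rather than PySpark so it is self-contained; the fields, rules and sample rows are illustrative, not part of the posting:

```python
# Illustrative data-quality checks of the kind run before loading pipeline output.
ROWS = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "", "age": 29},
    {"id": 3, "email": "c@example.com", "age": -5},
]

# Each rule maps a field to a predicate the value must satisfy.
RULES = {
    "email": lambda v: bool(v) and "@" in v,   # non-empty and roughly well-formed
    "age": lambda v: 0 <= v <= 120,            # plausible range
}

def validate(rows, rules):
    """Return (valid_rows, failures); each failure records the row id and broken rules."""
    valid, failures = [], []
    for row in rows:
        broken = [field for field, ok in rules.items() if not ok(row[field])]
        if broken:
            failures.append((row["id"], broken))
        else:
            valid.append(row)
    return valid, failures

valid, failures = validate(ROWS, RULES)
print(len(valid), failures)  # 1 [(2, ['email']), (3, ['age'])]
```

In a PySpark pipeline the same idea would typically be expressed as DataFrame filter conditions, with failing rows routed to a quarantine table rather than silently dropped.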

Posted 1 week ago

Apply

Exploring Extract Jobs in India

The extract job market in India is growing rapidly as companies across industries increasingly rely on data extraction to make informed business decisions. Extraction professionals play a crucial role in collecting and analyzing data to provide valuable insights to organizations. If you are considering a career in data extraction, this article will give you a view of the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for extract professionals in India varies based on experience. Entry-level professionals can expect to earn around INR 3-5 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 10 lakhs per annum.

Career Path

In the field of extract, a typical career path may include roles such as Data Analyst, Data Engineer, and Data Scientist. As professionals gain experience and expertise, they may progress to roles like Senior Data Scientist, Data Architect, and Chief Data Officer.

Related Skills

In addition to expertise in data extraction, professionals in this field are often expected to have skills in data analysis, database management, programming languages (such as SQL, Python, or R), and data visualization tools.

Interview Questions

  • What is data extraction, and why is it important? (basic)
  • Can you explain the difference between structured and unstructured data? (basic)
  • How do you ensure the quality and accuracy of extracted data? (medium)
  • What tools or software have you used for data extraction in the past? (medium)
  • Can you walk us through a challenging data extraction project you worked on? (medium)
  • How do you handle missing or incomplete data during the extraction process? (medium)
  • What are some common challenges faced during data extraction, and how do you overcome them? (medium)
  • Explain the process of data cleansing after extraction. (medium)
  • How do you stay updated with the latest trends and technologies in data extraction? (medium)
  • Can you provide an example of a successful data extraction project you led? (advanced)
  • How do you approach data extraction from multiple sources with different formats? (advanced)
  • Explain the role of metadata in data extraction. (advanced)
  • How do you ensure data security and privacy during the extraction process? (advanced)
  • What are the key factors to consider when designing a data extraction strategy for a large dataset? (advanced)
  • How do you handle scalability issues in data extraction processes? (advanced)
  • Explain the concept of incremental data extraction. (advanced)
  • How do you measure the performance and efficiency of data extraction processes? (advanced)
  • Can you discuss the role of data governance in data extraction? (advanced)
  • How do you handle data extraction for real-time analytics? (advanced)
  • What are the best practices for data extraction in a cloud environment? (advanced)
  • How do you ensure data integrity and consistency during the extraction process? (advanced)
  • Can you explain the difference between ETL and ELT processes in data extraction? (advanced)
  • Describe a time when you had to troubleshoot a data extraction issue. (advanced)
  • How do you collaborate with other teams (such as data engineering or business analytics) during the data extraction process? (advanced)
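One concept from the questions above that interviewers frequently probe is incremental data extraction: pulling only the rows that changed since the last successful run, tracked by a watermark. Below is a minimal sketch of the watermark pattern; the `events` table, `updated_at` column, and date strings are hypothetical and chosen only to illustrate the idea.

```python
import sqlite3

# A throwaway in-memory source table with a change-tracking column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, updated_at TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "2024-01-01"), (2, "2024-01-05"), (3, "2024-01-09")],
)

def extract_incremental(conn, watermark):
    """Pull only rows changed since the last successful run (the watermark)."""
    rows = conn.execute(
        "SELECT id, updated_at FROM events WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    # Advance the watermark to the newest row seen, so the next run skips these.
    new_watermark = rows[-1][1] if rows else watermark
    return rows, new_watermark

# First run: everything after the stored watermark is "new".
batch1, wm = extract_incremental(conn, "2024-01-02")
# Second run: nothing has changed since, so nothing is extracted.
batch2, wm = extract_incremental(conn, wm)
```

In a real pipeline the watermark would be persisted between runs (in a control table or state store), and late-arriving updates would need care, but this captures the core answer interviewers usually look for.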

Closing Remark

As you prepare for a career in data extraction roles, remember to showcase not only your technical skills but also your problem-solving abilities and communication skills during interviews. With the right preparation and confidence, you can excel in the extract job market in India. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
