2987 Extraction Jobs - Page 32

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site


Job Description
We are looking for a Business Data Migration Expert for Procurement on the LDC ERP program who will ensure data is delivered according to global, deployment, and country requirements and timelines, overseeing data migration activities for assigned data objects, including but not limited to Vendor Master, Purchase Orders, Purchase Info Records, and Source Lists in both Direct and Indirect Procurement, while acting as the functional Single Point of Contact for data migration on assigned objects for each release.

Tasks and Responsibilities:
- Ensure timely completion of data cleansing activities.
- Collect data for manual and construction-related objects within agreed timelines.
- Collaborate with IT counterparts to prepare and validate value mappings.
- Create and maintain master data lists for assigned objects, where applicable.
- Provide business context and relevant information to technical teams to support data extraction and conversion from legacy systems for assigned objects.
- Work closely with IT teams and Country Business Data Owners to identify the data objects in scope for each country.
- Oversee data readiness and continuously verify data quality throughout the data lifecycle for assigned objects.
- Ensure data is fit for purpose and aligned with both internal and external stakeholder requirements.
- Review and formally approve data upload files before and after loading for all assigned objects.
- Perform manual data entry into target systems for applicable objects, when required.
- Carry out dual maintenance of data as necessary.
- Execute and approve data verification scripts to ensure data accuracy and integrity.
- Serve as the Single Point of Contact (SPoC) for assigned objects during the defect management process throughout Hypercare.

Requirements:
- 5+ years of experience in both country-specific and global roles, ideally with ERP SAP project implementation experience.
- Proven track record in data migration projects, with strong knowledge of business processes in the area of Vendor Master and Procurement data.
- Demonstrated ability to complete data cleansing and data loads for relevant data objects in alignment with regional project timelines and migration schedules.
- Proven capability to resolve major data quality issues related to assigned data objects within defined project timelines.
- Strong communication and negotiation skills, with the ability to engage effectively with diverse stakeholders across both business and technical teams.
- A strategic yet hands-on approach to work, capable of independently managing tasks such as running meetings, tracking progress, and addressing issues.
- Proficient in Microsoft Excel, with strong analytical skills to manage and interpret complex datasets.
- Timely completion of data cleansing for relevant data objects in line with the given data migration schedule.
- Successful completion of data loads for relevant data objects on projects, ensuring alignment with the migration schedule.
- Resolution of major data quality issues related to assigned data objects, meeting the defined project timeline.
- A bachelor's or master's degree is preferred, reflecting a strong academic foundation.
- Fluent in English; knowledge of additional languages is considered an asset.
- Candidates must provide a criminal record extract not older than three months.
(ref:hirist.tech)

Posted 1 week ago


5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Description and Requirements
CareerArc Code: CA-DN | Hybrid

"At BMC trust is not just a word - it's a way of life!"

We are an award-winning, equal opportunity, culturally diverse, fun place to be. Giving back to the community drives us to be better every single day. Our work environment allows you to balance your priorities, because we know you will bring your best every day. We will champion your wins and shout them from the rooftops. Your peers will inspire, drive, support you, and make you laugh out loud! We help our customers free up time and space to become an Autonomous Digital Enterprise that conquers the opportunities ahead - and are relentless in the pursuit of innovation!

BU Description
We are the Technology and Automation team that drives competitive advantage for BMC by enabling recurring revenue growth, customer centricity, operational efficiency, and transformation through actionable insights, focused operational execution, and obsessive value realization.

About You
You are a self-motivated, proactive individual who thrives in a fast-paced environment. You have a strong eagerness to learn and grow, continuously staying updated with the latest trends and technologies in data engineering. Your passion for collaboration makes you a valuable team player, contributing to a positive work culture while also guiding and mentoring junior team members. You're excited about problem-solving and have the ability to take ownership of projects from start to finish. With a keen interest in data-driven decision-making, you are ready to work on cutting-edge solutions that have a direct impact on the business.

Role and Responsibilities
As a Data Engineer, you will play a crucial role in leading and managing strategic data initiatives across the business. Your responsibilities will include:
- Leading data engineering projects across key business functions, including Marketing, Sales, Customer Success, and Product R&D.
- Developing and maintaining data pipelines to extract, transform, and load (ETL) data into data warehouses or data lakes.
- Designing and implementing ETL processes, ensuring the integrity, scalability, and performance of the data architecture.
- Leading data modeling efforts, ensuring that data is structured for optimal performance and that security best practices are maintained.
- Collaborating with data scientists, analysts, and stakeholders to understand data requirements and provide valuable insights across the customer journey.
- Guiding and mentoring junior engineers, providing technical leadership and ensuring best practices are followed.
- Maintaining documentation for data structures, ETL processes, and data lineage, ensuring clarity and ease of understanding across the team.
- Developing and maintaining data security, compliance, and retention protocols as part of best practice initiatives.

Professional Expertise

Must-Have Skills:
- 5+ years of experience in data engineering, data warehousing, and building enterprise-level data integrations.
- Proficiency in SQL, including query optimization and tuning for relational databases (Snowflake, MS SQL Server, RedShift, etc.).
- 2+ years of experience working with cloud platforms (AWS, GCP, Azure, or OCI).
- Expertise in Python and Spark for data extraction, manipulation, and data pipeline development.
- Experience with structured, semi-structured, and unstructured data formats (JSON, XML, Parquet, CSV).
- Familiarity with version control systems (Git, Bitbucket) and Agile methodologies (Jira).
- Ability to collaborate with data scientists and business analysts, providing data support and insights.
- Proven ability to work effectively in a team setting, balancing multiple projects, and leading initiatives.

Nice-to-Have Skills:
- Experience in the SaaS software industry.
- Knowledge of analytics governance, data literacy, and core visualization tools (Tableau, MicroStrategy).
- Familiarity with CRM and marketing automation tools (Salesforce, HubSpot, Eloqua).

Education
Bachelor's or master's degree in Computer Science, Information Systems, or a related field (advanced degree preferred).

BMC Software maintains a strict policy of not requesting any form of payment in exchange for employment opportunities, upholding a fair and ethical hiring process. At BMC we believe in pay transparency and have set the midpoint of the salary band for this role at 2,033,200 INR. Actual salaries depend on a wide range of factors that are considered in making compensation decisions, including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The salary listed is just one component of BMC's employee compensation package. Other rewards may include a variable plan and country-specific benefits. We are committed to ensuring that our employees are paid fairly and equitably, and that we are transparent about our compensation practices.

(Returnship@BMC) Had a break in your career? No worries. This role is eligible for candidates who have taken a break in their career and want to re-enter the workforce. If your expertise matches the above job, visit https://bmcrecruit.avature.net/returnship to learn more and how to apply.
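The extract-transform-load responsibilities this role describes can be sketched minimally in Python. All names, the record schema, and the table are illustrative assumptions, and SQLite stands in for a warehouse such as Snowflake or RedShift:

```python
import json
import sqlite3

# Illustrative raw records, as they might arrive from a source-system export.
RAW = '[{"id": 1, "amount": "120.50", "region": "emea"}, {"id": 2, "amount": "80.00", "region": "apac"}]'

def extract(raw: str) -> list:
    """Extract: parse semi-structured JSON into Python dicts."""
    return json.loads(raw)

def transform(rows: list) -> list:
    """Transform: enforce types and normalize values before loading."""
    return [(int(r["id"]), float(r["amount"]), r["region"].upper()) for r in rows]

def load(rows: list, conn: sqlite3.Connection) -> int:
    """Load: write typed rows into a warehouse-style table; returns row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]

conn = sqlite3.connect(":memory:")
print(load(transform(extract(RAW)), conn))  # 2
```

Note the `INSERT OR REPLACE` keyed on the primary key: re-running the load with the same batch leaves the row count unchanged, which is the usual way such pipelines are kept safe to retry.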

Posted 1 week ago


6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Title: Data Analyst - Insurance Domain
Duration: 6 Months
Employment Type: Contractual
Work Location: Gurugram & Bangalore (Priority) | Chennai, Pune, Mumbai (Secondary)

Job Description
We are looking for an experienced Data Analyst with strong domain knowledge in Insurance and expertise in handling end-to-end data workflows. The ideal candidate should have hands-on experience in data modelling, data analysis, data architecture, and data visualization, along with advanced skills in modern data tools such as Azure, Python, Spark, and PySpark, in order to deliver insights that support strategic business decisions in the insurance sector.

Key Responsibilities:
- Analyze large, complex datasets to identify trends, patterns, and insights relevant to the insurance business.
- Design and implement data models to support analytical and operational reporting needs.
- Build and maintain scalable data architectures using cloud platforms such as Azure.
- Develop efficient data pipelines and ETL processes using Python, Spark, and PySpark.
- Apply domain expertise to validate and ensure data accuracy, relevance, and usability.
- Create clear and insightful data visualizations and dashboards using open-source or enterprise tools (excluding Power BI).
- Collaborate with stakeholders to translate business requirements into analytical solutions.
- Ensure best practices in data governance, security, and documentation.

Key Skills Required:
- 6+ years of experience as a Data Analyst.
- 3-4 years of hands-on experience in the Insurance domain.
- Expertise in Data Modelling, Data Analysis, and Data Architecture.
- Proficiency in Azure, Python, Spark, and PySpark.
- Strong SQL skills for data extraction, transformation, and analysis.
- Experience with data visualization tools (excluding Power BI).
- Excellent communication and stakeholder management skills.
- Strong analytical thinking and problem-solving abilities.
(ref:hirist.tech)
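The trend-analysis work described here is essentially grouped aggregation over claim records. A plain-Python sketch of the idea, with hypothetical claim fields; in the PySpark setting the listing describes, the same shape would be a `groupBy(...).avg(...)` over a DataFrame read from the lake:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical claim records; in practice these would come from a
# PySpark DataFrame rather than an in-memory list.
claims = [
    {"policy_type": "motor", "month": "2024-01", "amount": 52000},
    {"policy_type": "motor", "month": "2024-01", "amount": 48000},
    {"policy_type": "health", "month": "2024-01", "amount": 91000},
    {"policy_type": "motor", "month": "2024-02", "amount": 61000},
]

def avg_claim_by_segment(rows):
    """Group claims by (policy_type, month) and average the amounts -
    the same shape as df.groupBy("policy_type", "month").avg("amount")."""
    groups = defaultdict(list)
    for r in rows:
        groups[(r["policy_type"], r["month"])].append(r["amount"])
    return {key: mean(amounts) for key, amounts in groups.items()}

result = avg_claim_by_segment(claims)
```

Here `result[("motor", "2024-01")]` averages the two January motor claims to 50000, which is the kind of segment-level figure a trend dashboard would plot month over month.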

Posted 1 week ago


4.0 - 7.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Role Overview
We are looking for a skilled Data Scientist with expertise in data analytics, machine learning, and AI to join our team. The ideal candidate will have a strong command of data tools, programming, and knowledge of LLMs and Generative AI, contributing to the growth and automation of our business.

Responsibilities:

Data Analysis & Visualization:
- Develop and manage data pipelines, ensuring data accuracy and integrity.
- Design and implement insightful dashboards using Power BI to help stakeholders make data-driven decisions.
- Extract and analyze complex data sets using SQL to generate actionable insights.

Machine Learning & AI Models:
- Build and deploy machine learning models to optimize key business functions like discount management, lead qualification, and process automation.
- Apply Natural Language Processing (NLP) techniques for text extraction, analysis, and classification from customer documents.
- Implement and fine-tune Generative AI models and large language models (LLMs) for various business applications, including prompt engineering.

Automation & Innovation:
- Use AI to streamline document verification, data extraction, and customer interaction processes.
- Innovate and automate manual processes, creating AI-driven solutions for internal teams and customer-facing systems.
- Stay abreast of the latest advancements in machine learning, NLP, and generative AI, applying them to real-world business problems.

Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- 4-7 years of experience as a Data Scientist, with proficiency in Python, SQL, Power BI, and Excel.
- Expertise in building machine learning models and utilizing NLP techniques for text processing and automation.
- Experience in working with large language models (LLMs) and generative AI to create efficient and scalable solutions.
- Strong problem-solving skills, with the ability to work independently and in teams.
- Excellent communication skills, with the ability to present complex data in a simple, actionable way to non-technical stakeholders.
(ref:hirist.tech)
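The document-classification task mentioned above can be illustrated with a deliberately minimal keyword-scoring sketch; the labels and keyword sets are invented for the example, and a production system would use a trained NLP model or an LLM as the listing describes:

```python
import re

# Illustrative label -> keyword sets for routing customer documents.
KEYWORDS = {
    "invoice": {"invoice", "amount", "due", "payment"},
    "complaint": {"refund", "broken", "disappointed", "delay"},
}

def classify(text: str) -> str:
    """Pick the label whose keyword set overlaps the document's tokens most."""
    tokens = set(re.findall(r"[a-z]+", text.lower()))
    scores = {label: len(tokens & kws) for label, kws in KEYWORDS.items()}
    return max(scores, key=scores.get)

print(classify("Invoice #42: payment of the full amount is due Friday"))  # invoice
```

Even this toy version shows the pipeline shape a real classifier keeps: normalize the text, score it against each class, and emit the best label for downstream automation.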

Posted 1 week ago


0 years

1 Lacs

Noida, Uttar Pradesh, India

On-site


Seventh Triangle Consulting - Job Description

Job Title: Outbound Marketing Intern
Location: Noida
Type: Internship (Full Time / From Office)
Stipend: INR 10,000/month
Duration: 2 Months (Full-Time offer based on performance)

About Us
Seventh Triangle started in 2018 as a Direct to Consumer enabler and Digital Transformation Agency. It was founded by a team who have been successful DTC entrepreneurs themselves. We help brands achieve revenue and profitability growth using data, technology, and marketing interventions. Seventh Triangle also happens to be a Shopify Plus Partner in India, which allows us to work with enterprise brands such as Jockey, Titan, Nykaa, V-Guard, and many more. With a team size of over 120 across two locations (Noida & Bengaluru), Seventh Triangle is a preferred partner in the Indian D2C and Shopify space.

Job Brief
We are seeking a motivated Outbound Marketing Intern to assist in implementing our outbound marketing strategy. This is an excellent opportunity for someone looking to gain hands-on experience in marketing and business development.

Key Responsibilities:
- Assist in the execution of email marketing campaigns.
- Utilize digital marketing channels, including email and LinkedIn, to engage potential customers and generate leads.
- Perform limited copywriting and editing of email content to fit communication requirements and goals.
- Manage and track the success of marketing campaigns, analyzing performance metrics.
- Conduct market research to identify potential target audiences.
- Ideate with the Content Team to create marketing materials and outreach content.
- Maintain accurate records of campaign performance and reporting.
- Refine communication strategy based on data insights.
- Participate in brainstorming sessions and contribute creative ideas.
- Determine new and innovative ways to improve click-through rates (CTR).

Requirements:
- Currently pursuing or recently completed a degree in Marketing, Business, or a related field.
- Strong written and verbal communication skills with an eye for detail.
- Ability to work independently and as part of a team.
- Proficiency in Microsoft Office, Google services, etc.
- Eagerness to learn and adapt in a fast-paced environment.
- Experience with lead generation and outbound marketing is a plus.

Note: To clarify, this is not a traditional sales internship. You won't be required to do field sales or make cold calls. Instead, you'll support the sales team through tasks like market research, lead data extraction, and LinkedIn and email outreach.

Posted 1 week ago


5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Attero Recycling Private Limited is a NASA-recognized metal extraction company and end-to-end recycler of Li-Ion batteries and e-waste, headquartered in Noida with a manufacturing facility in Roorkee, Uttarakhand. Attero is amongst a handful of elite organizations globally with the capability to extract pure metals like Lithium, Cobalt, Titanium, Nickel, Manganese, Graphite, Gold, Copper, and Palladium from end-of-life electronics and Lithium-ion batteries. The company is now in the process of global expansion, setting up operations in India, Europe, and North America. Given the pace at which the company wants to grow, it expects employees to go beyond their defined roles to accomplish results, cooperate and collaborate with other team members, apply innovation and new ideas, and take calculated risks like an entrepreneur.

Key Responsibilities:
- Lead a team of fullstack developers, providing technical guidance, mentorship, and support.
- Collaborate with product managers, designers, and other stakeholders to define project requirements and timelines.
- Architect, design, and implement scalable and maintainable solutions for our web applications.
- Review code, provide feedback, and ensure that best practices and coding standards are followed.
- Oversee the entire software development lifecycle, from planning and design to testing and deployment.
- Manage project priorities, deadlines, and resources to meet business objectives.
- Identify and address technical debt, performance bottlenecks, and other issues that may arise.
- Stay up-to-date with the latest technologies, trends, and best practices in web development.
- Foster a collaborative and innovative team culture, promoting knowledge sharing and continuous improvement.
- Act as a technical liaison between the development team and other departments within the company.

Requirements:
- B.Tech or MCA.
- 5-8 years of experience as a Fullstack Developer, with at least 2 years in a leadership or managerial role.
- Proficiency in front-end technologies such as HTML5, CSS3, JavaScript, and modern JavaScript frameworks (e.g., React, Angular).
- Strong understanding of back-end technologies such as Node.js, Python, or similar.
- Experience with database systems such as MySQL, MongoDB.
- Knowledge of RESTful APIs and web services.
- Solid understanding of software architecture, design patterns, and best practices.
- Experience with Agile development methodologies and tools such as Jira.
- Excellent leadership, communication, and interpersonal skills.
- Ability to prioritize tasks, delegate responsibilities, and manage multiple projects simultaneously.

Technical Expertise:
- 5+ years of Python development experience.
- Experience with the following Python libraries: FastAPI, Flask, or Django; Pandas.
- 5+ years of front-end development experience with JavaScript frameworks.
- 5+ years of experience with SQL.
- Experience working with structured and unstructured data.
- Experience with Linux.
- Experience with AWS.
(ref:hirist.tech)

Posted 1 week ago


3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


About The Company
Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche in serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators, and is known for providing innovative solutions using technology and data science to its client base as their preferred innovation partner. Veersa's rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across all emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and ROR, backends such as Java Spring Boot and NodeJs, and databases such as PostgreSQL, MS SQL, MySQL, and Oracle, on AWS and Azure cloud using serverless architecture. Veersa has two major business lines: Veersalabs, an in-house R&D and product development platform, and Veersa tech consulting, which delivers technical solutions for clients. Veersa's customer base includes large US healthcare software vendors, pharmacy chains, payers, providers, and hospital chains. Though Veersa's focus geography is North America, it also provides product engineering expertise to a few clients in Australia and Singapore.

Position: SE / Senior Data Engineer (with SQL, Python, Airflow, Bash)

About The Role
We are seeking a highly skilled and experienced Senior/Lead Data Engineer to join our growing Data Engineering Team. In this critical role, you will design, architect, and develop cutting-edge multi-tenant SaaS data solutions hosted on Azure Cloud. Your work will focus on delivering robust, scalable, and high-performance data pipelines and integrations that support our enterprise provider and payer data ecosystem. This role is ideal for someone with deep experience in ETL/ELT processes, data warehousing principles, and real-time and batch data integrations. As a senior member of the team, you will also be expected to mentor and guide junior engineers, help define best practices, and contribute to the overall data strategy. We are specifically looking for someone with strong hands-on experience in SQL, Python, and ideally Airflow and Bash scripting.

Key Responsibilities:
- Architect and implement scalable data integration and data pipeline solutions using Azure cloud services.
- Design, develop, and maintain ETL/ELT processes, including data extraction, transformation, loading, and quality checks using tools like SQL, Python, and Airflow.
- Build and automate data workflows and orchestration pipelines; knowledge of Airflow or equivalent tools is a plus.
- Write and maintain Bash scripts for automating system tasks and managing data jobs.
- Collaborate with business and technical stakeholders to understand data requirements and translate them into technical solutions.
- Develop and manage data flows, data mappings, and data quality & validation rules across multiple tenants and systems.
- Implement best practices for data modeling, metadata management, and data governance.
- Configure, maintain, and monitor integration jobs to ensure high availability and performance.
- Lead code reviews, mentor data engineers, and help shape engineering culture and standards.
- Stay current with emerging technologies and recommend tools or processes to improve the team's effectiveness.

Required Qualifications:
- B.Tech or B.E degree in Computer Science, Information Systems, or a related field.
- 3+ years of experience in data engineering, with a strong focus on Azure-based solutions.
- Proficiency in SQL and Python for data processing and pipeline development.
- Experience in developing and orchestrating pipelines using Airflow (preferred) and writing automation scripts using Bash.
- Proven experience in designing and implementing real-time and batch data integrations.
- Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse, Databricks, or similar technologies.
- Strong understanding of data warehousing principles, ETL/ELT methodologies, and data pipeline architecture.
- Familiarity with data quality, metadata management, and data validation frameworks.
- Strong problem-solving skills and the ability to communicate complex technical concepts clearly.

Preferred Qualifications:
- Experience with multi-tenant SaaS data solutions.
- Background in healthcare data, especially provider and payer ecosystems.
- Familiarity with DevOps practices, CI/CD pipelines, and version control systems (e.g., Git).
- Experience mentoring and coaching other engineers in technical and architectural decision-making.
(ref:hirist.tech)
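The "data quality & validation rules" responsibility above boils down to checking each record against field-level rules before the load step. A minimal sketch, with invented rule names and thresholds (not a real validation framework); in an Airflow setting, `partition` would run as a task between extract and load:

```python
# Illustrative field -> rule mapping; a real pipeline would load these
# from configuration per tenant.
RULES = {
    "member_id": lambda v: isinstance(v, str) and v != "",
    "age": lambda v: isinstance(v, int) and 0 <= v <= 120,
}

def validate(record: dict) -> list:
    """Return the names of every rule the record fails."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

def partition(records):
    """Split a batch into loadable rows and quarantined rows."""
    good, bad = [], []
    for r in records:
        (good if not validate(r) else bad).append(r)
    return good, bad

batch = [
    {"member_id": "M1", "age": 34},
    {"member_id": "", "age": 34},      # fails member_id rule
    {"member_id": "M3", "age": 200},   # fails age rule
]
good, bad = partition(batch)
print(len(good), len(bad))  # 1 2
```

Quarantining rather than dropping failed rows keeps the bad records available for the defect-triage and reconciliation work the role describes.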

Posted 1 week ago


0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Career Area: Technology, Digital and Data Job Description: Your Work Shapes the World at Caterpillar Inc. When you join Caterpillar, you're joining a global team who cares not just about the work we do – but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here – we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it. Role Definition Performs implementation, regular problem solving, maintenance and support for a agile software development. Responsibilities Designing, modifying, developing, writing and implementing software programming applications for target system using agile methods. Acquiring client requirements; resolving workflow problems through automation optimization. Writing source codes for new applications, and/or generating and enhancing code samples for existing applications. Utilizing automated testing tools to perform the testing and maintenance. Skill Descriptors Decision Making and Critical Thinking: Knowledge of the decision-making process and associated tools and techniques; ability to accurately analyze situations and reach productive decisions based on informed judgment. Level Working Knowledge: Applies an assigned technique for critical thinking in a decision-making process. Identifies, obtains, and organizes relevant data and ideas. Participates in documenting data, ideas, players, stakeholders, and processes. Recognizes, clarifies, and prioritizes concerns. Assists in assessing risks, benefits and consideration of alternatives. Effective Communications: Understanding of effective communication concepts, tools and techniques; ability to effectively transmit, receive, and accurately interpret ideas, information, and needs through the application of appropriate communication behaviors. 
Level Working Knowledge: Delivers helpful feedback that focuses on behaviors without offending the recipient. Listens to feedback without defensiveness and uses it for own communication effectiveness. Makes oral presentations and writes reports needed for own work. Avoids technical jargon when inappropriate. Looks for and considers non-verbal cues from individuals and groups. Software Development: Knowledge of software development tools and activities; ability to produce software products or systems in line with product requirements. Level Extensive Experience: Conducts walkthroughs and monitors effectiveness and quality of the development activities. Elaborates on multiple-development toolkits for traditional and web-based software. Has participated in development of multiple or large software products. Contrasts advantages and drawbacks of different development languages and tools. Estimates and monitors development costs based on functional and technical requirements. Provides consulting on both selection and utilization of developers' workbench tools. Software Development Life Cycle: Knowledge of software development life cycle; ability to use a structured methodology for delivering and managing new or enhanced software products to the marketplace. Level Working Knowledge: Describes similarities and differences of life cycle for new product development vs. new release. Identifies common issues, problems, and considerations for each phase of the life cycle. Works with a formal life cycle methodology. Explains phases, activities, dependencies, deliverables, and key decision points. Interprets product development plans and functional documentation. Software Integration Engineering: Knowledge of software integration processes and functions; ability to design, develop and maintain interfaces and linkage to alternative platforms and software packages. Level Working Knowledge: Has experience with designing data exchange interfaces to and from software product. 
Describes tools and techniques for extraction, transformation and loading of electronic data. Cites examples of common linkage requirements for software products and vendors. Works with integrating software into the customer or partner framework and infrastructure. Participates in the development of technology interfaces and bridges. Software Product Design/Architecture: Knowledge of software product design; ability to convert market requirements into the software product design. Level Extensive Experience: Demonstrates experience with the architecture and design of major or multiple products. Describes major software architecture alternatives and considerations. Explains design considerations for commercial database systems, operating systems and web. Displays experience in estimating the cost of a specific design of a proposed product. Facilitates design reviews and walkthroughs. Analyzes benefits and drawbacks of specific software designs and architecture. Software Product Technical Knowledge: Knowledge of technical aspects of a software products; ability to design, configure and integrate technical aspects of software products. Level Working Knowledge: Maintains and utilizes data related to install base configurations and environments. Solicits customer feedback; reports and monitors bugs and implementation issues. Participates in defining and conducting technical acceptance tests. Participates in creating technical requirements for software development and deployment. Explains basic environment and product configuration options. Software Product Testing: Knowledge of software product testing; ability to design, plan, and execute testing strategies and tactics to ensure software product quality and adherence to stated requirements. Level Working Knowledge: Participates in test readiness reviews, functional, volume, and load testing. Describes key features and aspects of a specific testing discipline or methodology. 
Tests software components for compliance with functional requirements and design specifications. Explains procedures for documenting test activities and results (e.g. errors, non-conformance, etc.) Conducts functional and performance testing on aspects of assigned products. Responsibilities: Top Candidates will have : Data Pipeline Development: Design, develop, and maintain scalable and efficient ETL/ELT pipelines using Python and AWS services (e.g., Lambda, Glue, S3, EC2, Step Functions). Data Warehousing: Architect, implement, and optimize data models and queries within Snowflake, ensuring optimal performance and scalability. Cloud Infrastructure: Manage and maintain data infrastructure on AWS, including provisioning, monitoring, and troubleshooting. Data Quality and Governance: Implement data quality checks and monitoring processes to ensure data integrity and reliability. Performance Optimization: Identify and resolve performance bottlenecks in data pipelines and queries. Collaboration: Work closely with data scientists, analysts, and other engineers to understand data requirements and deliver effective solutions. Automation: Automate data engineering tasks and processes to improve efficiency and reduce manual effort. Documentation: Create and maintain comprehensive documentation for data pipelines, infrastructure, and processes. Mentorship: Mentor junior data engineers and contribute to the team's knowledge sharing and best practices. Security: Implement and maintain data security best practices. Required Skills And Experience: Expertise in Python programming for data processing and automation. Strong proficiency in SQL for data manipulation and analysis. Extensive experience with AWS cloud services, particularly those related to data engineering (e.g., S3, Glue, Lambda, EC2, IAM, Step Functions, CloudWatch). Proven experience with Snowflake data warehousing. Experience designing and implementing ETL/ELT pipelines. 
Strong understanding of data modeling concepts and best practices. Experience with version control systems (e.g., Git). Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Experience with data orchestration tools like Airflow or similar. Preferred Skills: Experience with data streaming technologies (e.g., Kafka, Kinesis). Experience with infrastructure as code (IaC) tools (e.g., Terraform, CloudFormation). Experience with building data lakes. Experience with CI/CD pipelines in AZDO. This Job Description is intended as a general guide to the job duties for this position and is intended for the purpose of establishing the specific salary grade. It is not designed to contain or be interpreted as an exhaustive summary of all responsibilities, duties and effort required of employees assigned to this job. At the discretion of management, this description may be changed at any time to address the evolving needs of the organization. It is expressly not intended to be a comprehensive list of "essential job functions" as that term is defined by the Americans with Disabilities Act. Posting Dates: June 10, 2025 - June 23, 2025 Caterpillar is an Equal Opportunity Employer. Not ready to apply? Join our Talent Community.
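The data quality checks this Caterpillar listing calls for ("Implement data quality checks and monitoring processes to ensure data integrity") can be sketched as a pre-load validation gate. A minimal plain-Python illustration; the schema and field names are invented for the example, not taken from the posting:

```python
# Hypothetical pre-load data-quality gate for an ETL pipeline.
# REQUIRED maps each mandatory field to its expected type (illustrative only).
REQUIRED = {"order_id": int, "amount": float, "region": str}

def validate_records(records):
    """Split records into (valid, rejected) before loading to the warehouse."""
    valid, rejected = [], []
    for rec in records:
        ok = all(
            field in rec and isinstance(rec[field], ftype)
            for field, ftype in REQUIRED.items()
        )
        (valid if ok else rejected).append(rec)
    return valid, rejected

rows = [
    {"order_id": 1, "amount": 9.5, "region": "EU"},
    {"order_id": 2, "amount": "bad", "region": "US"},  # wrong type: rejected
]
good, bad = validate_records(rows)
```

In a real pipeline the rejected records would typically be routed to a quarantine table or dead-letter location for review rather than silently dropped.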

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description Develop and maintain Python scripts for extracting and analyzing metadata from MicroStrategy reports and REST APIs. Automate data extraction and processing tasks, and ingest the data into a relational database. Ensure the accuracy and reliability of the automation scripts. Work with the MicroStrategy BI engineer to validate the extracted metadata. Skills to Have: Strong Python programming skills with strong FastAPI knowledge. Experience creating and managing RESTful APIs. Experience with Microsoft Azure, including Web Apps management. Experience with data analysis and automation. Knowledge of MicroStrategy architecture (for report parsing) is good to have. Maintaining the code in Git / DevOps. Must Have: Python, FastAPI, libraries for data manipulation and API interaction, DevOps / Git. Good to Have: MicroStrategy architecture and API, cloud platforms (Microsoft Azure), good problem-solving skills, SQL knowledge with PostgreSQL (ref:hirist.tech)
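The metadata-extraction work described above (pulling report metadata from REST APIs and ingesting it into a relational database) largely comes down to flattening nested JSON into rows. A minimal sketch, assuming a hypothetical response shape rather than the actual MicroStrategy REST schema:

```python
import json

# Hypothetical report-metadata payload, shaped like a REST API response.
# The field names here are assumptions for illustration only.
payload = json.loads("""
{"reports": [
  {"id": "R1", "name": "Sales", "attributes": [{"name": "Region"}, {"name": "Month"}]},
  {"id": "R2", "name": "Stock", "attributes": [{"name": "SKU"}]}
]}
""")

def flatten_metadata(doc):
    """Turn nested report metadata into flat (report_id, report_name, attribute) rows."""
    rows = []
    for report in doc["reports"]:
        for attr in report["attributes"]:
            rows.append((report["id"], report["name"], attr["name"]))
    return rows

rows = flatten_metadata(payload)  # one row per (report, attribute) pair
```

Rows in this shape map directly onto a relational insert (e.g., an `executemany` against a PostgreSQL table), which is where the ingestion step described in the listing would pick up.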

Posted 1 week ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Position Summary Netoyed is seeking a highly skilled Data Integration and Power BI Developer. This Developer will be responsible for quickly connecting four external APIs, building Azure Data Factory pipelines for data ingestion and transformation, and delivering interactive dashboards and reports in Power BI. This role is essential to meeting project milestones for Juniper Landscaping's Power BI Integration initiative. Key Responsibilities API Integration and Data Ingestion Develop and configure secure connections to the following APIs: Paycom Acumatica Aspire Procuify Handle data extraction (structured and unstructured) and secure processing. Data Pipeline Development Build scalable ETL pipelines using Azure Data Factory (ADF). Implement automation for continuous data refresh and ingestion. Ensure proper data validation, error handling, and pipeline optimization. Dashboard And Report Creation Design and build operational, analytical, and executive dashboards in Power BI. Implement dynamic visualizations, KPIs, drill-throughs, and real-time updates. Apply role-based security and user access controls within Power BI. Project Collaboration Work closely with Netoyed project leads to ensure deliverables align with client expectations. Provide regular updates on progress, blockers, and completion milestones. Deliver technical documentation for API integrations, pipelines, and dashboards. Required Skills And Experience 3+ years professional experience with Azure Data Factory and data ingestion projects. 3+ years building dashboards and reports in Power BI. Proven ability to connect to REST/SOAP APIs and process JSON/XML responses securely. Strong skills in Azure SQL Database, Azure Data Lake Storage, and data processing. Proficiency with DAX, Power Query (M language), and data modeling best practices. Excellent troubleshooting, problem-solving, and documentation skills. Ability to work independently and meet aggressive short-term deadlines. 
Preferred Qualifications Experience with Azure DevOps for CI/CD of data pipelines and reporting solutions. Familiarity with Microsoft Fabric or Lakehouse architecture is a plus. Knowledge of data integration from HR, financial, or operational systems (e.g., Paycom, Acumatica). Microsoft certifications such as Azure Data Engineer Associate or Power BI Data Analyst are a plus. Job Details Start Date: Immediate to within 2 weeks. Hours: Estimated 30-40 hours per week (project workload driven). Work Mode: Netoyed office, US EST hours. Tools Provided: Access to required Azure resources, API documentation, and Power BI workspace environments. (ref:hirist.tech)
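API ingestion jobs like the ones this role describes usually have to walk paginated endpoints before handing data to the pipeline. A minimal, self-contained sketch of that loop; the fetch function is a stand-in for an authenticated HTTP call, and the page size is arbitrary:

```python
def fetch_all_pages(fetch_page, page_size=2):
    """Collect every record from a paginated API.

    `fetch_page(offset, limit)` is injected so the sketch stays self-contained;
    in a real ingestion job it would wrap an authenticated HTTP request.
    """
    records, offset = [], 0
    while True:
        batch = fetch_page(offset, page_size)
        records.extend(batch)
        if len(batch) < page_size:  # a short page means we've reached the end
            break
        offset += page_size
    return records

DATA = list(range(5))  # stand-in for rows held by the remote API

def fake_fetch(offset, limit):
    return DATA[offset:offset + limit]

all_rows = fetch_all_pages(fake_fetch)
```

Different vendors paginate differently (offset/limit, page numbers, or continuation tokens), so in practice the termination condition is adapted per API rather than reused verbatim.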

Posted 1 week ago

Apply

2.0 years

0 Lacs

Greater Kolkata Area

Remote


Job Summary We are actively seeking an experienced and highly skilled Application Developer with strong proficiency in C/C++ to join our innovative software development team. This role is crucial for designing and developing robust software solutions primarily focused on equipment control, efficient server-side management, and advanced image reading applications. The ideal candidate will be a collaborative team player, working closely with cross-functional teams to deliver high-performance, reliable, and scalable software for complex industrial and automation systems. This is a remote position within India, with the potential for overseas travel based on specific project needs. Key Responsibilities C/C++ Application Development: Design, develop, and maintain high-quality, high-performance applications primarily using C and C++ programming languages. Implement robust, scalable, and efficient code for critical industrial and automation applications. Equipment Control System Development: Design and implement intuitive and functional user interfaces (UIs) for precise equipment control systems. Develop software modules for controlling, monitoring, and interacting with various industrial machinery and equipment. Image Reading & Analysis Applications: Work on software components dedicated to advanced image reading, processing, and analysis. Develop algorithms and functionalities for tasks such as image acquisition, feature extraction, pattern recognition, and data interpretation from visual data. Server-Side Management: Develop and manage server-side functions to ensure reliable, scalable, and secure operation of industrial and automation systems. Implement data storage, retrieval, processing, and communication protocols for back-end operations. Integration & Support: Ensure seamless integration of software applications with various hardware components and devices. 
Provide essential onsite technical support if required, including troubleshooting, system diagnostics, and problem resolution at client locations or industrial sites. Collaboration: Collaborate effectively with hardware engineers, system architects, QA teams, and other stakeholders to ensure integrated solutions and successful project delivery. Education: Bachelor's or Master's degree in Computer Science, Software Engineering, Electronics Engineering, or a closely related technical field. Experience: 2 to 6 years of relevant, hands-on experience in application development using C/C++. Key Skills: Strong proficiency in C and C++ programming languages. Solid understanding of Object-Oriented Programming (OOP) principles and design patterns. GUI Frameworks: Experience with common GUI (Graphical User Interface) frameworks (e.g., Qt, MFC, GTK+, WxWidgets) for developing user interfaces. Hardware Interaction: Proven experience with hardware control interfaces and communicating with various industrial devices or equipment. Desirable: Exposure to image processing libraries (e.g., OpenCV) and concepts. Familiarity with server management tools, network programming, and communication protocols (e.g., TCP/IP, Modbus, OPC UA). Experience with multi-threading and concurrent programming. Problem-Solving: Strong debugging, analytical, and problem-solving skills with the ability to diagnose and resolve complex software issues. Travel Readiness: Willingness and ability to travel overseas for project deployments, client support, or specialized training as required. (ref:hirist.tech)

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


About The Job The ideal candidate is a hands-on technology developer with experience in developing scalable applications and platforms. They must be at ease working in an agile environment with little supervision. They should be self-motivated, with a passion for problem solving and continuous learning. Role And Responsibilities Strong technical, analytical, and problem-solving skills Strong organizational skills, with the ability to work autonomously as well as in a team-based environment Data pipeline framework development Technical Skills Requirements The candidate must demonstrate proficiency in data processing and extraction Ability to own and deliver on large, multi-faceted projects Fluency in complex SQL and experience with RDBMSs Project experience in Spark, PySpark, Scala, Python, NiFi, Hive, and NoSQL DBs Experience designing and building big data pipelines Experience working on large-scale, distributed systems Experience working on BigQuery would be an added advantage Strong hands-on experience with programming languages like PySpark, Scala with Spark, and Python. Exposure to various ETL and Business Intelligence tools Experience in shell scripting to automate pipeline execution. Solid grounding in Agile methodologies Experience with git and other source control systems Strong communication and presentation skills Nice-to-have Skills Experience in GTM, GA4 and Firebase BigQuery certification Unix or Shell scripting Strong delivery background across the delivery of high-value, business-facing technical projects in major organizations Experience of managing client delivery teams, ideally coming from a Data Engineering / Data Science environment API development Qualifications B.Tech./M.Tech./MS or BCA/MCA degree from a reputed university Looking for candidates with a notice period of 30 days or less (ref:hirist.tech)
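The group-and-aggregate step at the core of the big data pipelines mentioned above can be illustrated in plain Python; a PySpark job would express the same thing as a `groupBy` followed by an aggregation. The records and keys are invented for the example:

```python
from collections import defaultdict

# Plain-Python sketch of the group-and-aggregate a Spark/PySpark job performs
# (roughly df.groupBy("region").sum("amount")); the records are illustrative.
events = [
    {"region": "north", "amount": 10},
    {"region": "south", "amount": 5},
    {"region": "north", "amount": 7},
]

def sum_by_key(rows, key, value):
    """Group rows by `key` and sum the `value` column within each group."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

totals = sum_by_key(events, "region", "amount")
```

The distributed version differs mainly in execution, not logic: Spark partitions the rows across workers and merges per-partition partial sums, which is why the aggregation must be associative.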

Posted 1 week ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Who We Are Zinnia is the leading technology platform for accelerating life and annuities growth. With innovative enterprise solutions and data insights, Zinnia simplifies the experience of buying, selling, and administering insurance products. All of which enables more people to protect their financial futures. Our success is driven by a commitment to three core values: be bold, team up, deliver value, and that we do. Zinnia has over $180 billion in assets under administration, serves 100+ carrier clients, 2500 distributors and partners, and over 2 million policyholders. Who You Are You're a data-savvy Product Manager who thrives at the intersection of data analytics, insurance product design, and platform integration. You bring structure to ambiguity and deliver high-impact data solutions that power life & annuity platforms. What You'll Do Own and drive the full product lifecycle for data products, from research, market analysis, and roadmap planning to requirements definition and launch. Collaborate with Engineering, QA, Business, and Client Services to define technical solutions, estimate effort, plan releases, and manage sprint backlogs. Design and customize data layouts (e.g., Data Warehouse, Reserve, Tax, NCOA, Hedge) based on Zinnia standards and client-specific requirements. Translate business and regulatory requirements into clear technical specifications and user stories, leveraging SQL (Oracle/MS SQL Server) and APIs. Act as a key liaison across SDLC stakeholders. Lead QA and production deployment activities including FTEV validation, defect triage, and post-migration support. Perform root cause analysis for incidents, document impact and resolution, and implement preventative actions. Identify product gaps and growth opportunities through research on market trends, platform usage, and client feedback. Support strategic decisions with analysis on ROI, consumer behavior, build-vs-buy assessments, and platform performance. 
Maintain clear, detailed documentation (e.g., KT decks, data dictionaries, reference guides) and deliver regular training sessions. Communicate product and incident status updates effectively to leadership and client stakeholders. What You'll Need Bachelor's or Master's degree with 4+ years of experience in Data Product Management, Business Analysis, or a similar data-centric role. Strong hands-on expertise in SQL (Oracle and MS SQL Server) for data extraction, transformation, and analysis. Proven experience working with API integrations and data interface design in enterprise systems. Solid understanding of system architectures, client/server models, and data integration frameworks. Familiarity with business process modeling and workflow tools, including decision tables, data flow diagrams, and system mapping. Excellent collaboration and stakeholder management skills, with the ability to translate business requirements into technical solutions across product and engineering teams. Strong written, oral, and interpersonal communication skills. Willingness to develop industry-related knowledge. Analytical, organized with excellent written and verbal communication skills. Must be a self-starter, flexible and motivated. Great to have: experience in the BFSI industry across annuities, mutual funds, financial services, or life insurance. (ref:hirist.tech)

Posted 1 week ago

Apply

1.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Overview: As a Product Data Analyst at SMS Magic, you will play a crucial role in driving product insights through the extraction, manipulation, and analysis of large datasets. You will develop and maintain interactive dashboards, analyze data from Google Analytics 4, and collaborate closely with Product Managers and various stakeholders. Your ability to manage multiple tasks, deliver actionable insights, and support data-driven decision-making processes will be key to your success in this role. Key Responsibilities Data Extraction and Analysis: Utilize SQL and databases such as Redshift to extract, manipulate, and analyze large datasets to drive product insights. Google Analytics: Analyze and interpret data from Google Analytics 4 to understand user behavior, product performance, and identify opportunities for improvement. Collaboration: Work closely with Product Managers to define product metrics, set goals, and track performance. Multitasking: Manage multiple tasks simultaneously, ensuring timely delivery of insights and analyses. Stakeholder Engagement: Collaborate with various teams including engineering, marketing, and sales to gather requirements and deliver actionable insights. Presentations: Present findings and recommendations to stakeholders, supporting data-driven decision-making processes. Qualifications Experience: Minimum of 1 year of experience working in a product company. Technical Skills Proficiency in SQL and experience with databases like Redshift. Hands-on experience with dashboarding tools such as Tableau, Looker, or Power BI. Analytics: Experience working with Google Analytics 4 data. Additional Skills: Basic knowledge of Python is a plus, but not mandatory. Soft Skills Excellent multitasking skills and ability to manage multiple priorities in a fast-paced environment. Strong collaboration skills with a proven track record of working effectively with Product Managers. 
Exceptional stakeholder management skills, with the ability to communicate complex data insights in a clear and concise manner. What Working at SMS Magic Offers At SMS Magic, people's growth parallels the company's growth, and our work culture supports our commitment to creating a world-class CRM messaging company. Our work culture is built on high-performance teaming where everyone can achieve their potential and contribute to building a better working world for our people and our clients. We offer a sense of balance: we want our people to be active, healthy, and happy, not just in their jobs but in their lives outside of work. We offer a competitive compensation package where you'll be rewarded based on your performance and recognized for the value you bring to our business. In addition, we do our best to make your time with us a rewarding learning experience that helps you grow as an individual. Plus, we offer: The freedom and flexibility to handle your role in a way that's right for you. Gain exposure to a dynamic and growing global business environment. Exposure to innovative and cutting-edge technology and tools. Scope to showcase one's analytical capabilities and make high-impact contributions to business teams. (ref:hirist.tech)
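Defining product metrics with Product Managers, as this analyst role describes, often means computing funnel conversion from raw event data. A minimal sketch; the event names and users are invented for the example:

```python
# Hypothetical sketch of a product-metric computation: step-to-step
# conversion through a simple funnel, from (user, event) pairs.
events = [
    ("u1", "signup"), ("u1", "activate"), ("u1", "purchase"),
    ("u2", "signup"), ("u2", "activate"),
    ("u3", "signup"),
]

def funnel_conversion(events, steps):
    """Return the share of users reaching each step, relative to step one."""
    users_at = [set() for _ in steps]
    for user, name in events:
        if name in steps:
            users_at[steps.index(name)].add(user)
    base = len(users_at[0]) or 1  # avoid division by zero on empty funnels
    return [len(users) / base for users in users_at]

rates = funnel_conversion(events, ["signup", "activate", "purchase"])
```

In practice the same calculation is usually pushed down into SQL against a warehouse like Redshift; the Python version just makes the logic explicit.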

Posted 1 week ago

Apply

4.0 years

0 Lacs

Sadar, Uttar Pradesh, India

On-site


GCP Data Engineer We are looking for a GCP Data Engineer to design, develop, and maintain scalable data pipelines and cloud-based data platforms. You will work on building and optimizing data workflows, implementing robust data solutions using Google Cloud Platform (GCP) technologies, and collaborating closely with cross-functional teams to deliver high-impact, data-driven insights. This role requires a deep understanding of data architecture, GCP ecosystem, ETL/ELT processes, and the ability to lead, mentor, and execute with precision. Key Responsibilities Design, build, and maintain robust data extraction, transformation, and loading (ETL/ELT) pipelines across both on-premises and cloud platforms. Develop and support data products, pipelines, and analytical platforms leveraging GCP services. Perform application impact assessments, requirement reviews, and provide accurate work estimates. Create test strategies and implement site reliability engineering (SRE) measures for data systems. Participate in agile development sprints and contribute to solution design reviews. Mentor and guide junior Data Engineers on best practices and design patterns. Lead root cause analysis and resolution of critical data operations and post-implementation issues. Conduct technical data stewardship activities, including metadata management, data security, and privacy-by-design principles. Use Python and GCP technologies to automate data workflows and transformations. Work with SQL for data modeling, transformations, and analytical queries. Automate job scheduling and orchestration using Control-M, Apache Airflow, or Prefect. Write Unix shell scripts to support automation and monitoring of data operations. Support BI/analytics teams with structured and well-modeled data. Use Infrastructure as Code (IaC) tools like Terraform, Ansible, or Puppet for automated deployments and configuration management. 
Required Skills & Technologies Strong experience with Python, SQL, and Unix/Linux scripting. Proficient in GCP Data Services. Experience in designing and managing ETL/ELT pipelines across hybrid environments. Working knowledge of orchestration tools: Apache Airflow, Control-M, or Prefect. Understanding of modern data warehousing and cloud-based analytics architecture. Familiarity with Infrastructure-as-Code using Terraform, Puppet, or Ansible. Strong debugging and problem-solving abilities in complex data environments. Ability to work in Agile teams and deliver in short sprint cycles. Qualifications Bachelor's degree in Computer Science, Software Engineering, Data Science, Mathematics, or related field. 4+ years of hands-on experience in data engineering. 2+ years of experience in data architecture and solution design. GCP Certified Data Engineer certification is preferred. Excellent communication skills and the ability to collaborate with cross-functional teams. (ref:hirist.tech)
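The job orchestration this listing mentions (Control-M, Apache Airflow, Prefect) rests on running tasks in dependency order. A minimal stdlib sketch of that core idea using Python's `graphlib`; the task names and dependency graph are illustrative, not from any real pipeline:

```python
from graphlib import TopologicalSorter

# Minimal sketch of dependency-ordered task execution, the core idea behind
# orchestrators like Airflow or Prefect. Tasks and edges are illustrative.
ran = []

def make_task(name):
    def task():
        ran.append(name)  # stand-in for real work (query, copy, transform)
    return task

# Each key lists its predecessors: extract -> transform -> load.
graph = {"transform": {"extract"}, "load": {"transform"}}
tasks = {name: make_task(name) for name in ("extract", "transform", "load")}

for name in TopologicalSorter(graph).static_order():
    tasks[name]()  # a real orchestrator adds retries, logging, and scheduling
```

A production orchestrator layers scheduling, retries, backfills, and monitoring on top of exactly this topological-order execution, which is why pipeline definitions in Airflow are literally called DAGs.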

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title : BI Engineer (Power BI Developer) Experience : 5+ Years Location : Hyderabad Job Type: Full-Time Responsibilities Design, develop, and deploy interactive and visually appealing BI dashboards and reports using Microsoft Power BI (Desktop, Service, Embedded). Connect to various data sources (databases, APIs, flat files) and implement efficient data extraction, transformation, and loading (ETL) processes. Write and optimize complex SQL queries to extract, manipulate, and analyze data from diverse database systems. Leverage expertise in reporting tools to select the most appropriate visualization techniques and create user-friendly and insightful reports. Utilize Python for data analysis, data manipulation, automation of tasks, and integration with other data platforms. Employ scripting tools and languages (PowerShell, Bash) to automate repetitive tasks and streamline data workflows. Develop a strong understanding of various database systems, including relational databases (e.g., SQL Server, PostgreSQL, MySQL) and potentially cloud-based data warehouses (e.g., Azure Synapse Analytics, Snowflake). Work extensively with Azure DevOps for version control (managing branches and pull requests), collaboration, and potentially for setting up CI/CD pipelines for BI solutions. Collaborate closely with business analysts, data scientists, and other stakeholders to understand their reporting requirements and translate them into technical specifications. Ensure data accuracy, integrity, and consistency across all BI solutions. Optimize Power BI reports and dashboards for performance, scalability, and usability. Develop and maintain data models within Power BI, ensuring efficient data relationships and calculations using DAX (Data Analysis Expressions). Participate in the full BI development lifecycle, from requirements gathering and design to development, testing, and deployment. 
Stay up-to-date with the latest features, updates, and best practices in Power BI and the broader Microsoft data analytics ecosystem. Create and maintain comprehensive technical documentation for developed BI solutions. Provide support and troubleshooting for existing Power BI reports and dashboards. Required Skills Power BI: Extensive hands-on experience (5+ years) in developing and deploying BI solutions using Microsoft Power BI, including Power BI Desktop, Power BI Service, and potentially Power BI Embedded. SQL: Excellent proficiency in writing complex SQL queries for data extraction, manipulation, and analysis across various database platforms. Experience with query optimization and performance tuning. Reporting Tools Expertise: Proven experience and deep understanding of various reporting and data visualization tools beyond Power BI (e.g., Tableau, Qlik Sense, SSRS) is highly desirable. Python: Good working knowledge of Python for data analysis, data manipulation (using libraries like Pandas, NumPy), and automation of data-related tasks. Familiarity with Scripting Tools: Experience with scripting tools and languages such as PowerShell or Bash for automating tasks related to data management and deployment. Strong Understanding of Databases: Comprehensive understanding of relational database management systems (RDBMS) and data warehousing concepts. Experience working with different database systems (SQL Server, PostgreSQL, MySQL, Azure SQL Database). Experience with Azure DevOps: Proven experience working with Azure DevOps, including managing code repositories (Git), working with branches and pull requests, and ideally setting up and managing CI/CD pipelines for BI solutions. Qualifications: Bachelor's degree in Computer Science, Information Technology, Data Science, Business Analytics, or a related field. Minimum of 5 years of professional experience as a BI Developer with a strong focus on Power BI. 
Proven track record of successfully designing, developing, and deploying impactful BI solutions. Strong analytical and problem-solving skills with the ability to translate business requirements into technical solutions. Excellent verbal and written communication skills, with the ability to effectively communicate technical concepts to both technical and non-technical stakeholders. Ability to work independently and collaboratively within a team environment. A strong passion for data and the ability to derive meaningful insights from complex datasets. Bonus Points Experience with other Azure data services such as Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, and Azure Data Lake Storage. Knowledge of data warehousing methodologies (Kimball, Inmon) and dimensional modeling techniques (star schema, snowflake schema). Experience with data governance and data quality processes. Familiarity with other components of the Microsoft Power Platform (Power Apps, Power Automate). Experience working with agile development methodologies. Microsoft Power BI certifications (DA-100: Analyzing Data with Microsoft Power BI). (ref:hirist.tech)

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description We are seeking a talented Computer Vision Engineer with strong expertise in microservice deployment architecture to join our team. In this role, you will be responsible for developing and deploying computer vision models to analyze retail surveillance footage for use cases such as theft detection, employee efficiency monitoring, and store traffic analysis. You will work on designing and implementing scalable, cloud-based microservices to deliver real-time and post-event analytics to improve retail operations. Responsibilities: Develop computer vision models: Build, train, and optimize deep learning models to analyze surveillance footage for detecting theft, monitoring employee productivity, tracking store busy hours, and other relevant use cases. Microservice architecture: Design and deploy scalable microservice-based solutions that allow seamless integration of computer vision models into cloud or on-premise environments. Data processing pipelines: Develop data pipelines to process real-time and batch video data streams, ensuring efficient extraction, transformation, and loading (ETL) of video data. Integrate with existing systems: Collaborate with backend and frontend engineers to integrate computer vision services with existing retail systems such as POS, inventory management, and employee scheduling. Performance optimization: Fine-tune models for high accuracy and real-time inference on edge devices or cloud infrastructure, optimizing for latency, power consumption, and resource constraints. Monitor and improve: Continuously monitor model performance in production environments, identify potential issues, and implement improvements to accuracy and efficiency. 
Security and privacy: Ensure compliance with industry standards for security and data privacy, particularly regarding the handling of video footage and sensitive data. Requirements: 5+ years of proven experience in computer vision, including object detection, action recognition, and multi-object tracking, preferably in retail or surveillance applications. Hands-on experience with microservices deployment on cloud platforms (e.g., AWS, GCP, Azure) using Docker, Kubernetes, or similar technologies. Experience with real-time video analytics, including working with large-scale video data and camera streams. Technical Skills: Proficiency in programming languages like Python, C++, or Java. Expertise in deep learning frameworks (e.g., TensorFlow, PyTorch, Keras) for developing computer vision models. Strong understanding of microservice architecture, REST APIs, and serverless computing. Knowledge of database systems (SQL, NoSQL), message queues (Kafka, RabbitMQ), and container orchestration (Kubernetes). Familiarity with edge computing and hardware acceleration (e.g., GPUs, TPUs) for running inference on embedded devices. Preferred Qualifications: Experience with deploying models to edge devices (NVIDIA Jetson, Coral, etc.). Understanding of retail operations and common challenges in surveillance. Knowledge of data privacy regulations such as GDPR. Soft Skills: Strong analytical and problem-solving skills. Ability to work independently and in cross-functional teams. Excellent communication skills to convey technical concepts to non-technical audiences. Benefits: Competitive salary and stock options. Health insurance. If you're passionate about creating cutting-edge computer vision solutions and deploying them at scale to transform retail operations, we'd love to hear from you! Apply Now. (ref:hirist.tech)
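Evaluating the object-detection models this role centres on typically uses intersection-over-union (IoU) to score predicted bounding boxes against ground truth. A minimal sketch; the box coordinates are invented for the example:

```python
# Intersection-over-union (IoU), the standard overlap measure for
# object-detection evaluation. Boxes are (x1, y1, x2, y2) in pixels.
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)  # zero when boxes are disjoint
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

score = iou((0, 0, 10, 10), (5, 5, 15, 15))  # partial overlap
```

Detection pipelines use this both for matching predictions to ground truth (e.g., counting a detection as correct above an IoU threshold such as 0.5) and inside non-maximum suppression to discard duplicate boxes.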

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Join us as a Data Scientist at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. As a part of the Service Operations team, you will deliver technology stack, using strong analytical and problem solving skills to understand the business requirements and deliver quality solutions. You'll be working on complex technical problems that will involve detailed analytical skills and analysis. This will be done in conjunction with fellow engineers, business analysts and business stakeholders. To be successful as a Data Scientist you should have experience with: Essential Skills Solid understanding of machine learning concepts and model deployment. Prior experience in a data science role, indicating a strong foundation in the field. Advanced coding proficiency in Python, with the ability to design, test, and correct complex scripts. Proficiency in SQL, for managing and manipulating data. Working in an Agile manner and leading Agile teams using Jira. Some Other Highly Valued Skills Include Excellent modelling skills, as evidenced by an advanced degree or significant experience. Strong quantitative and statistical skills, enabling logical and methodical problem-solving. Good understanding and experience of big data technologies and the underlying approach. Good interpersonal skills for maintaining relationships with multiple business areas, including senior leadership and compliance. Ability to manage laterally and upwards across multiple discipline technical areas. Version control using Bitbucket, Gitlab, etc. Cloud experience (AWS, Azure or GCP) You may be assessed on key critical skills relevant for success in role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. 
This role is based in Pune. Purpose of the role To use innovative data analytics and machine learning techniques to extract valuable insights from the bank's data reserves, leveraging these insights to inform strategic decision-making, improve operational efficiency, and drive innovation across the organisation. Accountabilities Identification, collection, extraction of data from various sources, including internal and external sources. Performing data cleaning, wrangling, and transformation to ensure its quality and suitability for analysis. Development and maintenance of efficient data pipelines for automated data acquisition and processing. Design and conduct of statistical and machine learning models to analyse patterns, trends, and relationships in the data. Development and implementation of predictive models to forecast future outcomes and identify potential risks and opportunities. Collaborate with business stakeholders to seek out opportunities to add value from data through Data Science. Assistant Vice President Expectations To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions/ business divisions. Lead a team performing complex tasks, using well developed professional knowledge and skills to deliver on work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraisal of performance relative to objectives and determination of reward outcomes If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. 
OR, for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external, such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information. 'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
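The data cleaning and wrangling accountabilities described above can be sketched in miniature. This is an illustrative example only; the field names (`customer_id`, `segment`, `balance`) are hypothetical and not any Barclays schema.

```python
# A minimal sketch of a data-cleaning step: drop incomplete rows,
# normalise text fields, and coerce numeric types. Field names are invented.

def clean_records(raw_rows):
    """Return only valid rows, with trimmed IDs and parsed balances."""
    cleaned = []
    for row in raw_rows:
        # Skip rows missing the mandatory identifier.
        if not row.get("customer_id"):
            continue
        try:
            balance = float(str(row.get("balance", "")).replace(",", ""))
        except ValueError:
            continue  # unparseable numeric value -> exclude from analysis
        cleaned.append({
            "customer_id": str(row["customer_id"]).strip(),
            "segment": str(row.get("segment", "unknown")).strip().lower(),
            "balance": balance,
        })
    return cleaned

rows = [
    {"customer_id": " C001 ", "segment": "Retail", "balance": "1,250.50"},
    {"customer_id": "", "segment": "Retail", "balance": "10"},    # dropped: no ID
    {"customer_id": "C002", "segment": "SME", "balance": "n/a"},  # dropped: bad number
]
print(clean_records(rows))
```

In a real pipeline this step would sit upstream of the statistical and machine learning models, so that only quality-checked data reaches them.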

Posted 1 week ago

Apply

1.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job function: Customer Delivery. Employment Type: Contract. Workplace Type: Onsite. Location: Mumbai, India - 400101. Must-have skills: Content Management Systems (CMS), JIRA, HoPP Platform (Home Page Production). About The Opportunity Magnifi is looking for a proactive and detail-oriented Editorial Content Executive to join our team on a 1-year contractual basis, working directly with the Jio Hotstar team. About Videoverse VideoVerse is an innovative and dynamic video technology company that serves as an umbrella brand for our powerful AI-based products, Magnifi and Illusto. We are an enthusiastic, passionate, fast-growing, diverse, and vibrant team that works with some of the biggest names in broadcasting (3 of the top 5 in India, and growing quickly in Europe and the USA) and on some of the biggest sporting events in the world, like the Indian Premier League (T20 IPL), multiple European football leagues, and much more. The company is at a stage of rapid growth and is actively hiring enthusiastic individuals who believe in making a difference and revolutionizing the way content is created, distributed, and consumed in the evolving video-centric world. For more information, please click the links mentioned below: Videoverse LinkedIn: https://www.linkedin.com/company/videoverse/ Videoverse: https://vverse.ai/ Magnifi: https://magnifi.ai/ About The Products Magnifi is an AI-powered enterprise product that automatically detects key moments in video content, enabling real-time creation of highlights and short-form videos. With a global presence, Magnifi collaborates with various industries, including OTT platforms, sports broadcasters, and e-gaming platforms. Their vision is to empower users to create and share impactful stories across digital platforms with ease. Fostering a culture of innovation and collaboration, Magnifi's leadership team is dedicated to leveraging AI for simplified video editing. 
The company has made notable acquisitions and received recognition for its contributions to the industry. Role And Responsibilities For India:
- CMS tray creations, set-up, updates & maintenance
- Metadata changes, Jira ticket requests, on-call for CMS changes
- Masthead boosting for new & priority releases
- Editorial masthead updates for tournament season
- Page management on Retool
- HoPP: Prod and Pre-Prod widget & space creation, management and experiments (creation and execution), on-call for home page changes
- GEC data extraction & curation for channel teams
For HSI:
- CMS tray creations, set-up, updates & maintenance
- Page management on Retool
- Editorial masthead upkeep

Posted 1 week ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


At Cotality, we are driven by a single mission—to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society. Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry. Job Description In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's Product Development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights. 
Regular working hours will be from 12 noon to 9 pm IST. This is a hybrid role: the team will be in the office for the first half of the day, three days a week, leaving around 3 or 4 pm and logging back in from home for the remaining hours (same as other local teams). Training will be provided on the product. Description We are seeking a highly skilled Data Analyst to join our Analytics Support team to serve customers across the property insurance and restoration industries. As a Data Analyst, you will play a crucial role in developing methods and models to inform data-driven decision processes, resulting in improved business performance for both internal and external stakeholder groups. You will be responsible for interpreting complex data sets and providing valuable insights to enhance the value of data assets. 
The successful candidate will have a strong understanding of data mining techniques, methods of statistical analysis, and data visualization tools. This position offers an exciting opportunity to work in a dynamic environment, collaborating with cross-functional teams to support decision processes that will guide the respective industries into the future. Responsibilities Collaborate with cross-functional teams to understand and document requirements for analytics products. Serve as the primary point of contact for new data/analytics requests and support for customers. Act as the domain expert and voice of the customer to internal stakeholders during the analytics development process. Develop and maintain an inventory of data, reporting, and analytic product deliverables for assigned customers. Work with customer success teams to establish and maintain appropriate customer expectations for analytics deliverables. Create and manage change order tickets on behalf of customers within internal frameworks. Ensure timely delivery of assets to customers and aid in the development of internal processes for the delivery of analytics deliverables. Work with IT/Infrastructure teams to provide customer access to assets and support internal audit processes to ensure data security. Create and optimize complex SQL queries for data extraction, transformation, and aggregation. Develop and maintain data models, dashboards, and reports to visualize data and track key performance metrics. Conduct validation checks and implement error handling mechanisms to ensure data reliability. Collaborate closely with stakeholders to align project goals with business needs and perform ad-hoc analysis to provide actionable recommendations. Analyze large and complex datasets to identify trends, patterns, and insights, and present findings and recommendations to stakeholders in a clear and concise manner. 
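The SQL responsibilities above (complex queries for extraction, transformation, and aggregation, feeding KPI dashboards) can be illustrated with a small, self-contained example. The table, columns, and KPI here are hypothetical, shown against an in-memory SQLite database:

```python
# Illustrative aggregation query of the kind described above: totals per
# region plus a simple KPI (share of claim value that is closed).
# Table and column names are invented for the sketch.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (region TEXT, status TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("north", "closed", 1200.0), ("north", "open", 300.0),
     ("south", "closed", 800.0)],
)

rows = conn.execute(
    """
    SELECT region,
           SUM(amount) AS total,
           SUM(CASE WHEN status = 'closed' THEN amount ELSE 0 END)
               / SUM(amount) AS closed_share
    FROM claims
    GROUP BY region
    ORDER BY region
    """
).fetchall()
print(rows)
```

The same `GROUP BY` result set could then be loaded into a Power BI model or dashboard to track the metric over time.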
Job Qualifications 5+ years’ experience building Power BI dashboards, data modeling and analysis. Bachelor’s degree in computer science, data science, statistics, or a related field is preferred. Advanced knowledge of data analysis tools such as Power Query, Excel, and Power BI. Demonstrated expertise in creating Power BI reports and dashboards, including the ability to connect to various data sources, prepare and model data, and create visualizations. Excellent visual and storytelling skills with data. Experience with Power Query for importing, transforming, and shaping data. Expert knowledge of DAX for creating calculated columns and measures to meet report-specific requirements. Proficiency in SQL with the ability to write complex queries and optimize performance. Experience with ETL processes, data pipelines and automation a plus. Strong analytical and problem-solving skills. Excellent attention to detail and the ability to work with large datasets. Effective communication skills, both written and verbal. Ability to work independently and collaborate in a team environment. Knowledge of the property insurance industry will be a plus. Cotality's Diversity Commitment Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone’s unique contributions, experiences and values. We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences. Equal Opportunity Employer Statement Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace. 
Please apply on our website for consideration. Privacy Policy Global Applicant Privacy Policy By providing your telephone number, you agree to receive automated (SMS) text messages at that number from Cotality regarding all matters related to your application and, if you are hired, your employment and company business. Message & data rates may apply. You can opt out at any time by responding STOP or UNSUBSCRIBE and will automatically be opted out company-wide.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Skill required: Tech for Operations - Technology Architecture. Designation: App Automation Eng Specialist. Qualifications: Any Graduation. Years of Experience: 7 to 11 years. About Accenture Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song — all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com What would you do? In this role, you will be responsible for designing and architecting end-to-end automation solutions leveraging RPA (primarily A360 and UiPath), OCR, AI/ML, GenAI, and Agentic AI frameworks. You will work closely with clients, business stakeholders, and development teams to ensure the successful implementation of intelligent automation solutions that drive business impact.
What are we looking for?
- Primary expertise in RPA (primarily A360 and UiPath), with a strong foundation in automation design and implementation
- Experience with OCR / GenAI-based OCR (Document Automation/Document Understanding) for structured and unstructured data extraction
- Proficiency in Python for scripting, automation, and AI/ML model integration
- Strong knowledge of GenAI, Machine Learning, and Agentic AI frameworks
- Deep experience in architectural design, NFR identification, infrastructure planning, and automation scalability
- Hands-on expertise in code quality checks, troubleshooting, and analytical analysis
- Ability to work in a self-driven manner while being a great team player
Good to Have:
- Experience in cloud-based automation deployments (AWS/Azure/GCP)
- Exposure to AI-driven decision automation and self-learning bots
- Familiarity with Low-Code/No-Code (LCNC) automation platforms
Roles and Responsibilities:
- Lead the architecture, design, and implementation of intelligent automation solutions
- Define and develop technical standards, best practices, governance models, solution design documents (SDDs) and technical design documents (TDDs) for automation programs
- Identify and document Non-Functional Requirements (NFRs) and infrastructure requirements
- Perform code reviews, quality checks, and troubleshooting for automation solutions
- Collaborate with AI/ML, Data Science, and GenAI teams to integrate Agentic AI and Machine Learning models into automation workflows
- Work with business and IT teams to ensure scalable, high-performing automation solutions
- Provide technical mentorship to developers and ensure team alignment with automation strategies
- Drive continuous improvement and innovation in automation technologies and frameworks
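The OCR-based structured data extraction mentioned above usually pairs an OCR engine with a rules layer that pulls fields out of the recognised text. The sketch below shows only that second, post-OCR step; the document type, field names, and regex patterns are hypothetical, not part of any A360 or UiPath API.

```python
# Illustrative post-OCR field extraction: given raw text returned by an
# OCR engine, pull out structured fields with regular expressions.
import re

def extract_invoice_fields(ocr_text):
    # Hypothetical patterns for a hypothetical invoice layout.
    patterns = {
        "invoice_no": r"Invoice\s*(?:No\.?|#)\s*[:\-]?\s*(\w+)",
        "total": r"Total\s*[:\-]?\s*\$?([\d,]+\.\d{2})",
    }
    fields = {}
    for name, pattern in patterns.items():
        match = re.search(pattern, ocr_text, re.IGNORECASE)
        fields[name] = match.group(1) if match else None
    return fields

sample = "Invoice No: INV1042\nDate: 01/05\nTotal: $1,234.56"
print(extract_invoice_fields(sample))
```

In practice, GenAI-based document understanding replaces or supplements such hand-written rules, especially for unstructured layouts.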

Posted 1 week ago

Apply


8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Every day, tens of millions of people come to Roblox to explore, create, play, learn, and connect with friends in 3D immersive digital experiences – all created by our global community of developers and creators. At Roblox, we’re building the tools and platform that empower our community to bring any experience that they can imagine to life. Our vision is to reimagine the way people come together, from anywhere in the world and on any device. We’re on a mission to connect a billion people with optimism and civility, and we're looking for amazing talent to help us get there. A career at Roblox means you’ll be working to shape the future of human interaction, solving unique technical challenges at scale, and helping to create safer, more civil shared experiences for everyone. About The Role The Roblox Operating System (ROS) team is responsible for the foundational technology and services that power all experiences on Roblox. This critical team ensures a seamless, performant, and reliable platform for our global community of users and developers. You will be the first Product Manager hire for our India office, reporting to Theresa Johnson, the Head of Product for ROS. You will play a pivotal role in building and enhancing our data analytics capabilities within the Roblox operating system, collaborating closely with the India-based Data Engineering team, which includes an Engineering Manager, three engineers, and multiple data scientists. This is a full-time onsite role based out of our Gurugram office. Shift Time: 2:00 PM - 10:30 PM IST (cabs will be provided). You Will Collaborate with data engineering and product engineering teams in India to build integrated analytics tooling. Develop cross-functional data visualization and reporting capabilities. Implement advanced insights extraction methodologies. Develop self-service data exploration tools. Integrate data analytics capabilities into the Roblox operating system. Ensure seamless data flow across organizational platforms. 
Implement cutting-edge data infrastructure solutions. Build a scalable data registry that will allow us to understand, register, classify and govern data across all of ROS. This will involve partnering with data engineers to build and maintain robust data pipelines integrating diverse sources like HR systems (Workday, Greenhouse), collaboration tools (Slack, Zoom), business applications (Pigment, Zendesk), and internal Roblox applications. Partner with Data Scientists to process and transform data into actionable insights, developing systems that generate builder development signals and promote positive behaviors. Contribute to achieving key outcomes such as reducing data access request resolution time by 60%, increasing self-service data exploration adoption by 75%, and achieving 99.9% data pipeline reliability. You Have A Bachelor’s degree or equivalent experience in Computer Science, Computer Engineering, or a similar technical field. 8+ years of product management experience, with a focus on data platforms, analytics, or developer tools. Strong understanding of data infrastructure, data warehousing, and ETL processes, including experience with data governance tools focusing on discovery, cataloging, metadata management, classification, and quality assurance. Proven ability to work autonomously and define product scope in ambiguous environments. Experience collaborating with engineering and data science teams to deliver impactful data products. Excellent communication and interpersonal skills, with the ability to articulate complex technical concepts to diverse audiences. You Are Someone with strong product intuition of what we should be doing rather than just following instructions. Highly organized with a strong sense of urgency. A collaborative team player who can navigate cross-functional partnerships effectively. Adaptable and comfortable working in a fast-paced, evolving environment. 
A strategic thinker with a bias for action and a focus on delivering measurable results. Roles that are based in our San Mateo, CA Headquarters are in-office Tuesday, Wednesday, and Thursday, with optional in-office on Monday and Friday (unless otherwise noted). Roblox provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. Roblox also provides reasonable accommodations for all candidates during the interview process.
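The scalable data registry described in this listing (understand, register, classify, and govern data across sources like Workday, Greenhouse, Slack, and Zoom) can be sketched at its simplest as a catalog keyed by dataset name with a source system and a sensitivity classification. All class, method, and classification names below are illustrative, not Roblox internals.

```python
# Simplified sketch of a data registry: register datasets with a source
# system and a sensitivity classification so they can be discovered and
# governed. Names and the classification scheme are invented.
from dataclasses import dataclass, field

@dataclass
class DataRegistry:
    entries: dict = field(default_factory=dict)

    def register(self, name, source, classification):
        # Governance hook: reject classifications outside the known scheme.
        if classification not in {"public", "internal", "restricted"}:
            raise ValueError(f"unknown classification: {classification}")
        self.entries[name] = {"source": source, "classification": classification}

    def find_by_source(self, source):
        """Discovery: list registered datasets originating from one system."""
        return [n for n, e in self.entries.items() if e["source"] == source]

registry = DataRegistry()
registry.register("hiring_funnel", source="Greenhouse", classification="restricted")
registry.register("meeting_stats", source="Zoom", classification="internal")
print(registry.find_by_source("Zoom"))
```

A production registry would add metadata management, lineage, and quality checks on top of this cataloging core, tied into the data pipelines the role partners on.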

Posted 1 week ago

Apply

7.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Summary Strategy As part of FM – Funds and Securities Services Technology, the developer is recruited to develop and deliver solutions supporting various initiatives that enable Operations to fulfil client requirements. Business Technical Requirement Experience in designing and implementing enterprise applications in Java/.Net. Should have experience in Oracle/SQL Server. Good knowledge of developing stored procedures using Oracle PL/SQL. Knowledge of Big Data concepts and tech stack such as Hadoop, Hive/Spark/Sqoop. Should be able to work on data extraction and data lake initiatives. DevOps (ADO, JIRA, Jenkins, Ansible, GitHub) exposure/experience. Knowledge of AWS/Azure cloud-native and VM concepts. Proficient in container technologies. Proficiency in Oracle SQL, SQL Server and DBA tasks. Working experience in solution design, capacity planning and sizing. Functional Requirements Experience in Fund Accounting, Transfer Agency and/or Hedge Fund Administration. Knowledge of market instruments and conventions. Specialism within Fund Accounting/client reporting/investment operations. Hands-on in MultiFonds Fund Administration and Global Investor products or equivalent Fund Accounting products. Key Responsibilities Risk Management Proactively identify and track obsolescence of hardware and software components, including OS or CVE patches, for Funds apps and interfaces. Governance Develop and deliver as per the SDF-mandated process. Follow release management standards and tools to deploy deliverables to production. Hand over to Production Support as per process. Regulatory & Business Conduct Display exemplary conduct and live by the Group’s Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. 
Lead the Agile Squad to achieve the outcomes set out in the Bank’s Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment. Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters. Key stakeholders FM – Funds and Securities Services Operations, Technology and Production Support. Other Responsibilities Coordinate between the product vendor and business stakeholders for requirement finalisation. Understand functional requirements. Provide solutions by developing the required components or reports. Unit test and support SIT and UAT. Follow the SCB change management process to promote developed components to production. Proactively input to solution design, including an architectural view of our technology landscape and experience on data optimisation solutions (DB table mapping, data logic, development code design, etc.). Participate in identification of non-functional requirements such as security requirements and performance objectives. Coordinate between various internal support teams. Pick up new technologies with ease, solve complex technical problems and multitask between different projects. 
Skills And Experience Windows/Linux; WebLogic, Citrix, MQ, Solace; AWS/Azure cloud-native and VM concepts; PL/SQL; domain experience in Fund Accounting, Transfer Agency and/or Hedge Fund Administration; DevOps. Qualifications Degree in Computer Science, MCA or equivalent. 7 to 10 years of prior work experience in the stated technologies. About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together we: Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do. Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well. Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term. What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. 
  • Time-off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to a minimum of 30 days.
  • Flexible working options based around home and office locations, with flexible working patterns.
  • Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits.
  • A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
  • Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies, where everyone feels respected and can realise their full potential.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Pune/Pimpri-Chinchwad Area

On-site


Company Description

WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.

Job Description

Key Responsibilities:

  • Design, develop, test, and deploy scalable web applications using Python and related technologies.
  • Build responsive and interactive user interfaces using HTML, CSS, and JavaScript frameworks.
  • Develop and maintain automation scripts using Selenium for testing and data extraction.
  • Integrate machine learning models into production environments.
  • Collaborate with stakeholders and other developers to deliver high-quality products.
  • Write clean, maintainable, and efficient code following best practices.
  • Troubleshoot, debug, and upgrade existing systems.
  • Participate in code reviews and contribute to team knowledge sharing.

Required Skills & Qualifications:

  • 2+ years of professional experience in Python full stack development.
  • Proficiency in Python and frameworks such as Django or Flask.
  • Strong front-end development skills with HTML, CSS, Node.js and JavaScript (React.js or Vue.js is a plus).
  • Experience with Selenium for automation and testing.
  • Familiarity with machine learning concepts and libraries (e.g., scikit-learn, TensorFlow, or PyTorch).
  • Experience with RESTful APIs and third-party integrations.
  • Knowledge of version control systems like Git.
  • Understanding of database systems (SQL and NoSQL).
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and teamwork abilities.

Preferred Qualifications:

  • Experience with cloud platforms (AWS, Azure, or GCP).
  • Familiarity with containerization tools like Docker.
  • Exposure to CI/CD pipelines and DevOps practices.
  • Knowledge of Agile/Scrum methodologies.

Qualifications

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Minimum of 2+ years of experience as a Python Developer.
  • Relevant certifications in Python development or related technologies are a plus.

Additional Information

  • Work from office only.
  • Shift timings: 4:00 pm - 1:00 am OR 6:00 pm - 3:00 am.

Posted 1 week ago

Apply

Exploring Extraction Jobs in India

The extraction sector in India is a thriving industry with numerous opportunities for job seekers. Extraction jobs involve recovering valuable resources such as oil, gas, minerals, and other natural resources from the earth, and these roles are essential to the growth and development of several sectors in the country.

Top Hiring Locations in India

  1. Mumbai
  2. Delhi
  3. Bangalore
  4. Kolkata
  5. Hyderabad

These cities are known for their active hiring in extraction roles, with a high demand for skilled professionals in the industry.

Average Salary Range

The average salary range for extraction professionals in India varies based on experience and skills. Entry-level positions can expect to earn around INR 3-5 lakhs per annum, while experienced professionals can earn upwards of INR 10-15 lakhs per annum.

Career Path

In the extraction industry, a typical career path may involve starting as a Junior Engineer or Technician, moving on to roles such as Senior Engineer, Project Manager, and eventually reaching positions like Operations Manager or Director.

Related Skills

In addition to extraction skills, professionals in this field are often expected to have knowledge of geology, environmental regulations, safety procedures, and project management.

Interview Questions

  • What is the importance of exploration in the extraction industry? (basic)
  • How do you ensure compliance with environmental regulations during extraction processes? (medium)
  • Can you explain the difference between surface mining and underground mining? (medium)
  • What are some of the challenges faced in the extraction industry, and how would you address them? (medium)
  • Describe a successful extraction project you were involved in and the role you played. (advanced)
  • How do you stay updated on new technologies and advancements in the extraction industry? (basic)
  • What steps would you take to improve efficiency in extraction processes? (medium)
  • How do you prioritize safety in extraction operations? (medium)
  • Can you discuss a time when you had to handle a difficult situation during an extraction project? (advanced)
  • What software tools or technologies are you proficient in for extraction work? (basic)
  • Explain the importance of risk assessment in extraction operations. (medium)
  • How do you ensure quality control in extraction processes? (medium)
  • What are the key factors to consider when selecting a site for extraction activities? (medium)
  • How do you manage stakeholder relationships in the extraction industry? (medium)
  • Describe a time when you had to work under strict deadlines in an extraction project. How did you handle it? (advanced)
  • What strategies would you implement to reduce the environmental impact of extraction activities? (medium)
  • Can you discuss a time when you had to troubleshoot a technical issue during an extraction operation? (advanced)
  • How do you handle conflicts within a team working on an extraction project? (medium)
  • What are the different types of extraction methods used in the industry, and when would you use each? (advanced)
  • How do you ensure cost-effectiveness in extraction operations? (medium)
  • Explain the role of technology in modern extraction processes. (basic)
  • What are the key components of a successful extraction plan? (medium)
  • How do you assess the feasibility of an extraction project? (medium)
  • Can you discuss a time when you had to adapt to unexpected changes in an extraction project? (advanced)
  • How do you ensure the health and safety of workers in extraction operations? (medium)

Closing Remark

As you prepare for interviews and explore opportunities in the extraction industry in India, remember to showcase your skills, experience, and passion for the field. With the right preparation and confidence, you can excel in extraction roles and contribute to the growth of this dynamic industry. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies