
24,278 ETL Jobs - Page 19

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. In Oracle supply chain and operations at PwC, you will specialise in providing consulting services for Oracle supply chain and operations applications. You will analyse client needs, implement software solutions, and offer training and support for seamless integration and utilisation of Oracle supply chain and operations applications. Working in this area, you will enable clients to optimise their supply chain processes, improve operational efficiency, and achieve their strategic objectives.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
- Respond effectively to the diverse perspectives, needs, and feelings of others.
- Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems.
- Use critical thinking to break down complex concepts.
- Understand the broader objectives of your project or role and how your work fits into the overall strategy.
- Develop a deeper understanding of the business context and how it is changing.
- Use reflection to develop self-awareness, enhance strengths and address development areas.
- Interpret data to inform insights and recommendations.
- Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Role/Job Title: Experienced Senior Associate
Tower: Oracle
Experience: 5 years
Key Skills: FAW/OAC/ADW/IOT
Educational Qualification: BE / B.Tech / ME / M.Tech / B.Sc / B.Com / BBA
Work Location: India

Job Description
As an Experienced Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include, but are not limited to:
- Minimum 2 years of experience on Oracle's cloud-based analytics platforms, including OAC/ADW/ODI and/or FAW.
- Strong hands-on expertise in OAC, including Analytics, Data Visualization, and Semantic Model Development.
- Very good development experience with OAC reports and dashboards using measures, filters, calculated measures, calculated items, etc.
- Must be able to follow the report testing process.
- Experience migrating from OBIEE to OAC, and migrating between OAC instances.
- Very good understanding of data warehousing concepts and data warehouse modeling.
- Thorough hands-on experience with SQL (on any RDBMS source).
- Hands-on knowledge of building analyses and visualizations based on datasets created using SQL or Excel data sources.
- Good knowledge of RPD modeling and the usage of data modelers on OAC.
- Able to troubleshoot report errors and issues on OBIEE/OAC and understand the tool limitations of OAC.
- Experience in performance tuning OAC analyses; this includes analyzing the explain plan of the query, tuning the data model, and making modifications to the tables such as indexing.
- Good knowledge of coding, debugging, design and documentation.
- Understanding of the flow of data between ERP and the data warehouse.
- Ability to model and build BI Publisher reports is preferable.
- Expertise in SQL and knowledge of PL/SQL, ODI, or any ETL tool is preferable.
- Working with multidimensional sources (like Essbase) is a plus.
- Any work on OTBI is a plus.
- Expertise in the Oracle Analytics Cloud tool; knowledge of BI Apps concepts is preferable.
- Familiarity with upgrade activities and the issues encountered during an OBIEE-to-OAC upgrade.
- Knowledge of FAW (ERP and SCM), ADW, OAC (Classic, Data Visualization, Semantic Model Development), or ODI is a plus.
- Use feedback and reflection to develop self-awareness and personal strengths, and to address development areas.
- Proven track record as an SME in your chosen domain.
- Ability to produce client POCs/POVs for integrating emerging technologies such as blockchain and AI with the associated product platform, and for increasing their adoption.
- Mentor junior resources within the team; conduct knowledge-sharing sessions and lessons learnt.
- Flexible to work on stretch opportunities/assignments.
- Demonstrate critical thinking and the ability to bring order to unstructured problems.
- Review ticket quality and deliverables.
- Adherence to SLAs; experience in incident management, change management and problem management.
- Know how and when to use the tools available for a given situation, and be able to explain the reasons for the choice.
- Use straightforward, structured communication when influencing and connecting with others.
- Able to read situations and modify behavior to build quality relationships.
- Uphold the firm's code of ethics and business conduct.

Managed Services - Application Evolution Services
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs.
Every day we are motivated and passionate about making our clients' businesses better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services, our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment. Within our global Managed Services platform, we provide Application Evolution Services (formerly Application Managed Services), where we focus on the evolution of our clients' applications and cloud portfolios. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions, so that they can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective.
As a member of our Application Evolution Services (AES) team, you will work in a fast-paced environment on a mix of critical AES offerings and engagements, including help desk support, enhancement and optimization work, and strategic roadmap and advisory-level work. You will also be key in helping to win and support customer engagements, not only from a technical perspective but also from a relationship perspective.
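The performance-tuning requirement above (analyzing a query's explain plan and adding indexes) can be sketched with any RDBMS. A minimal, stdlib-only illustration using SQLite as a stand-in engine follows; the table, column names, and data are invented for the example, and OAC itself would sit on Oracle ADW or another warehouse:

```python
import sqlite3

# Toy warehouse table: an aggregate query filtered on a non-key column.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("APAC", 100.0), ("EMEA", 250.0), ("APAC", 75.0)],
)

query = "SELECT SUM(amount) FROM sales WHERE region = ?"

# Before indexing: the planner reports a full scan of the table.
plan_before = cur.execute("EXPLAIN QUERY PLAN " + query, ("APAC",)).fetchall()

cur.execute("CREATE INDEX idx_sales_region ON sales(region)")

# After indexing: the same equality filter is answered via the index.
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query, ("APAC",)).fetchall()
print(plan_before[-1][-1], "->", plan_after[-1][-1])
```

The same workflow (run the plan, spot the scan, add the index, re-check) carries over to Oracle's `EXPLAIN PLAN FOR ...` on the warehouse behind OAC.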

Posted 3 days ago

Apply

9.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description

Job Summary: We are looking for an experienced and technically strong Cloud Infrastructure Automation Engineer to join our team. The ideal candidate will have 9+ years of overall cloud experience, including 5+ years of automation experience, and will be responsible for building, automating, and maintaining robust infrastructure on Oracle Cloud Infrastructure (OCI). The role includes end-to-end automation using Terraform, scripting, CI/CD integration, and operational excellence using modern DevOps practices. Exposure to other cloud platforms (AWS, Azure), container orchestration (Kubernetes/OKE), open-source monitoring, and security frameworks is highly desirable.

Key Responsibilities:
- Design, automate, and manage OCI infrastructure using Terraform and Infrastructure as Code principles.
- Develop and integrate CI/CD pipelines using tools like Jenkins, Git, GitHub Actions, or GitLab CI/CD.
- Deploy and manage containerized applications using Kubernetes, preferably Oracle Kubernetes Engine (OKE).
- Implement monitoring solutions using Prometheus, Grafana, and other open-source observability tools.
- Automate infrastructure provisioning and system configuration using Bash, Python, or shell scripting.
- Architect and implement secure cloud environments, ensuring best practices in networking, identity and access management, and data protection.
- Design and support cloud security frameworks, applying zero-trust principles and governance models.
- Collaborate in cross-functional teams to provide guidance on cloud architecture, automation patterns, and security controls.
- Troubleshoot and resolve infrastructure and deployment issues efficiently in production and non-production environments.
- Participate in planning and architecture discussions to deliver robust and scalable infrastructure solutions.

Required Qualifications:
- 9+ years of overall cloud experience, with 5+ years in cloud automation.
- Proven hands-on experience with Oracle Cloud Infrastructure (OCI).
- Strong expertise in Terraform for provisioning OCI resources.
- High proficiency in scripting and programming languages (e.g., Bash, Python, shell).
- Solid experience deploying and managing workloads on Kubernetes, ideally on OKE.
- Experience building monitoring dashboards and alerts using Prometheus and Grafana.
- Strong understanding of cloud networking, security, and IAM models.
- Hands-on experience designing cloud architecture and developing secure infrastructure frameworks.
- Familiarity with modern CI/CD and DevOps tools and methodologies.
- Strong analytical, troubleshooting, and communication skills.

Preferred Skills (Good to Have):
- Experience with AWS or Azure cloud platforms.
- Familiarity with ETL workflows and container lifecycle management (e.g., Docker).
- Exposure to secrets management, policy enforcement, and compliance automation.
- Knowledge of service meshes, ingress controllers, and advanced Kubernetes patterns.

Certifications (Preferred):
- OCI Architect/Infrastructure certification
- HashiCorp Terraform Associate
- Kubernetes/DevOps certifications (e.g., CKA, CKAD)
- Security-related certifications (e.g., CCSP, OCI Security, CISSP)

Career Level - IC3

Responsibilities
Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that require independent judgment in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. Demonstrates expertise in delivering functional and technical solutions on moderately complex customer engagements. May act as the team lead on projects. Effectively consults with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for moderately complex projects.

Qualifications
Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges.
We’ve partnered with industry leaders in almost every sector, and we continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, protected veteran status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
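One way to picture the "OCI infrastructure as code via Terraform" responsibility is driving Terraform from a script: Terraform accepts configuration in a JSON syntax, so a Python tool can emit it. The sketch below is a minimal illustration, not a complete working config. Every OCID, name, and shape is a placeholder, and a real `oci_core_instance` would also need `source_details` and VNIC/subnet settings:

```python
import json

# Hypothetical OCI compute instance expressed in Terraform's JSON syntax.
config = {
    "terraform": {"required_providers": {"oci": {"source": "oracle/oci"}}},
    "resource": {
        "oci_core_instance": {
            "app_vm": {
                "compartment_id": "ocid1.compartment.oc1..exampleuniqueid",  # placeholder
                "availability_domain": "Uocm:PHX-AD-1",                      # placeholder
                "shape": "VM.Standard.E4.Flex",
                "shape_config": {"ocpus": 1, "memory_in_gbs": 8},
                "display_name": "app-vm-01",
            }
        }
    },
}

serialized = json.dumps(config, indent=2)
# Written to main.tf.json, this is what `terraform init && terraform plan`
# would consume alongside provider credentials.
print(serialized)
```

In a CI/CD pipeline the same generation step would run before `terraform plan`, so reviewers see the rendered JSON in the merge request.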

Posted 3 days ago

Apply

7.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Company Description
WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.

Job Description
The role aims to leverage data analysis, engineering, and AI/ML techniques to drive strategic business decisions and innovations. This position is responsible for designing and implementing scalable data pipelines, developing innovative models, and managing cloud infrastructure to ensure efficient data processing and storage. The role also involves collaborating with cross-functional teams to translate business needs into technical solutions, mentoring junior team members, and staying abreast of the latest technological advancements. Effective communication, particularly in English, is essential to articulate complex insights and foster a collaborative environment. The ultimate goal is to enhance data-driven decision-making and maintain a competitive edge through continuous improvement and innovation.
Data and AI Specialist, Consulting Role

Key Responsibilities:
- Data Science: Python developer experienced with Azure Cloud, using Azure Databricks; create models and algorithms to analyze data and solve business problems.
- Application Architecture: knowledge of enterprise application integration and application design.
- Cloud Management: knowledge of hosting and supporting applications on Azure Cloud.
- Data Engineering: build and maintain systems to process and store data efficiently.
- Collaboration: work with different teams to understand their needs and provide data solutions; share insights through reports and presentations.
- Research: keep up with the latest tech trends and improve existing models and systems.
- Mentorship: guide and support junior team members.

Must have:
- Python development in AI/ML and data analysis: strong programming skills in Python or R, plus SQL.
- Proficiency in statistical analysis and machine learning techniques.
- Hands-on experience in NLP and NLU.
- Experience with data visualization and reporting tools (e.g., Power BI).
- Experience with Microsoft Power Platform (e.g., Power Automate) and SharePoint.
- Hands-on experience using SharePoint for content management.
- Data engineering: expertise in designing and maintaining data pipelines and ETL processes.
- Experience with data storage solutions (e.g., Azure SQL).
- Understanding of data quality and governance principles.
- Experience with Databricks for big data processing and analytics.
- Cloud management: proficiency with cloud platforms (e.g., Azure); knowledge of hosting and supporting applications on Azure Cloud; knowledge of cloud security and compliance best practices.
- Collaboration and communication: experience in agile methodologies and project management tools (e.g., Jira); strong interpersonal and communication skills; ability to translate complex technical concepts into business terms; experience working in cross-functional teams; excellent English communication skills, both written and verbal.
- Research and development: ability to stay up to date with the latest advancements in data science, AI/ML, and cloud technologies; experience in conducting research and improving model performance.
- Mentorship: experience guiding and mentoring junior team members; ability to foster a collaborative and innovative team environment.

Must exhibit the following core behaviors: taking ownership of, and accountability for, the projects assigned.

Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, AI/ML, IT, or related fields, or an MCA degree.
- 7-9 years of relevant experience.
- Proficiency in Python, R, cloud platforms (Azure), and data visualization tools like Power BI.
- Advanced certifications and experience with big data technologies and real-time data processing.
- Excellent English communication skills.
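As a flavor of the NLP/data-analysis work the role above describes, here is a deliberately tiny, stdlib-only sketch of TF-IDF term weighting over a toy corpus. The documents are invented for illustration; real work at this scale would use scikit-learn, Spark, or Databricks rather than hand-rolled functions:

```python
import math
from collections import Counter

docs = [
    "invoice overdue payment reminder",
    "payment received thank you",
    "reminder meeting schedule today",
]

def tf(doc: str) -> dict:
    """Term frequency: the share of the document each word accounts for."""
    words = doc.split()
    return {w: c / len(words) for w, c in Counter(words).items()}

def idf(term: str, corpus: list) -> float:
    """Inverse document frequency: rarer terms get higher weight."""
    hits = sum(1 for d in corpus if term in d.split())
    return math.log(len(corpus) / hits)

# Weight of "payment" in the first document: tf = 1/4, idf = ln(3/2).
weight = tf(docs[0])["payment"] * idf("payment", docs)
```

The point of the weighting is visible even on three documents: a term that appears everywhere gets an IDF of zero and contributes nothing as a feature.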

Posted 3 days ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Data Engineer
Location: Gurugram, India
Experience: 6-8 years
Employment Type: Full-time

Key Responsibilities
- Design, build, and optimize scalable data pipelines to support advanced Media Mix Modeling (MMM) and Multi-Touch Attribution (MTA) models.
- Collaborate with data scientists to prepare data for training, validation, and deployment of machine learning models and statistical algorithms.
- Ingest and transform large volumes of structured and unstructured data from multiple sources, ensuring data quality and integrity.
- Partner with cross-functional teams (AdSales, Analytics, and Product) to deliver reliable data solutions that drive marketing effectiveness and campaign performance.
- Automate data workflows and build reusable components for model deployment, data validation, and reporting.
- Support data scientists with efficient access to cleaned and transformed data, optimizing for both performance and usability.
- Contribute to the design of a unified data architecture supporting AdTech, OTT, and digital media ecosystems.
- Stay updated on the latest trends in data engineering, AI-driven analytics, and cloud-native tools to improve data delivery and model deployment processes.

Required Skills & Experience
- 6+ years of hands-on experience in data engineering, data analytics, or related roles.
- At least 3 years working in AdTech, AdSales, or digital media analytics environments.
- Experience supporting MMM and MTA modeling efforts with high-quality, production-ready data pipelines.
- Proficiency in Python, SQL, and data transformation tools; experience with R is a plus.
- Strong knowledge of data modeling, ETL pipelines, and handling large-scale datasets using distributed systems (e.g., Spark, AWS, or GCP).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and data services (S3, Redshift, BigQuery, Snowflake, etc.).
- Experience with BI tools such as Tableau, Power BI, or Looker for report automation and insight generation.
- Solid understanding of statistical techniques, A/B testing, and model evaluation metrics.
- Excellent communication and collaboration skills to work with both technical and non-technical stakeholders.

Preferred Qualifications
- Experience in media or OTT data environments.
- Exposure to machine learning model deployment, model monitoring, and MLOps practices.
- Knowledge of Kafka, Airflow, or dbt for orchestration and transformation.
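For readers unfamiliar with the MTA models this pipeline feeds, the simplest attribution rule (linear MTA) splits each conversion's credit evenly across the channels a user touched. A hedged, stdlib-only sketch with invented user IDs and channels:

```python
from collections import defaultdict

# Hypothetical touchpoint paths for users who converted, ordered by time.
converting_paths = {
    "user_1": ["search", "social", "email"],
    "user_2": ["social", "email"],
}

def linear_attribution(paths: dict) -> dict:
    """Split each conversion's credit equally across its touchpoints."""
    credit = defaultdict(float)
    for channels in paths.values():
        share = 1.0 / len(channels)      # one conversion, split evenly
        for channel in channels:
            credit[channel] += share
    return dict(credit)

credit = linear_attribution(converting_paths)
# Total credit always equals the number of conversions (here 2.0).
```

Production MTA uses position-based or data-driven (e.g., Shapley-value) weighting instead of an even split, but the data-engineering contract is the same: clean, time-ordered touchpoint paths per user.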

Posted 3 days ago

Apply

3.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

The Position
We are seeking a skilled Data Engineer to join our dynamic team. In this role, you will play a pivotal part in designing and implementing custom solutions that support complex financial and IP calculations, reporting, and data transformations. Your work will directly contribute to improving our clients' operational efficiency and decision-making capabilities.

What You Will Do
- Problem-solving: develop innovative solutions to complex challenges in financial calculations, rights management, and process optimization.
- Data engineering solutions: design, build, and maintain scalable data pipelines for migration, cleansing, transformation, and integration tasks, ensuring high-quality data outcomes.
- Database development & maintenance: configure, implement, and refine stored procedures and queries to ensure optimal performance, scalability, and maintainability of database systems.
- ETL & data migration: develop robust ETL (Extract, Transform, Load) processes that integrate data from diverse sources, ensuring seamless migration and transformation for downstream analytics and reporting.
- Automation & scripting: create and implement automated scripts and tools to streamline routine database tasks, reduce manual intervention, and improve overall operational efficiency.
- Collaboration: partner with cross-functional teams to align data engineering efforts with broader business objectives and deliver seamless solutions that drive value across the organization.
- IP commerce data expertise: leverage deep knowledge of financial and rights data to develop creative solutions that address client needs and advance business goals.
- Process improvement: continuously identify opportunities to optimize workflows, automate repetitive tasks, and enhance efficiency in data processing and delivery.

Must-Have
What you will bring to the role:
- A minimum of 3-5 years of experience in a database developer or analyst position.
- A Bachelor's degree in Computer Science or Engineering, or equivalent work experience.
- Exceptional analytical thinking and problem-solving capabilities.
- Strong verbal and written communication skills, with the ability to articulate technical concepts clearly.
- Proficiency in analyzing complex financial or IP data sets.
- Hands-on experience with engineering principles, including designing and implementing scalable solutions.
- Strong attention to detail and commitment to ensuring data accuracy and integrity.

Preferred
- Experience working with SQL and/or Python for data manipulation and analysis.
- Experience working in finance or IP-related industries, with an understanding of their unique challenges and requirements.
- Familiarity with handling large-scale datasets and cloud-based platforms (e.g., AWS, Azure, Google Cloud).
- Knowledge of DevOps practices and CI/CD pipelines to streamline database management and deployment.
- Understanding of data warehousing architectures and business intelligence tools for advanced analytics.
- Certifications in relevant database technologies (e.g., Microsoft Certified: Azure Database Administrator Associate, or Oracle Certified Professional) are a bonus.

Shift: Flexible (US & UK shifts)

Equal Employment Opportunity
Rightsline is an equal opportunity workplace. All candidates will be afforded equal opportunity through the recruiting process. We do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, disability, or gender identity and/or expression. We are dedicated to growing a diverse team of highly talented individuals and creating an inclusive environment where everyone feels empowered to bring their authentic selves to work.

Apply Today
If you want to join a company that strives for mission, purpose and impact, we encourage you to apply today.
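The "cleansing, transformation, and integration" work described above boils down to steps like the following, sketched here with the standard library only. The feed format, field names, and rows are all invented for illustration:

```python
import csv
import io

# Hypothetical rights/royalty feed with inconsistent formatting.
raw_feed = """title,territory,amount
Song A,US,"1,200.50"
Song B,uk,300
Song C,US,not-a-number
"""

def clean_rows(text: str):
    """Normalize territories, parse amounts, and quarantine bad rows."""
    loaded, rejected = [], []
    for row in csv.DictReader(io.StringIO(text)):
        try:
            amount = float(row["amount"].replace(",", ""))
        except ValueError:
            rejected.append(row)          # route to a manual-review queue
            continue
        loaded.append({
            "title": row["title"],
            "territory": row["territory"].upper(),
            "amount": round(amount, 2),
        })
    return loaded, rejected

loaded, rejected = clean_rows(raw_feed)
```

Keeping a rejected-rows queue rather than silently dropping bad records is what preserves the "data accuracy and integrity" requirement: every input row is accounted for downstream.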

Posted 3 days ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Area(s) of Responsibility: Qlik Sense
- Design, develop, and maintain interactive dashboards and reports using Qlik Sense; extract data and manage Qlik Sense servers, while also ensuring data integrity and performance optimization.
- Develop innovative and visually appealing Qlik Sense dashboards and reports that provide actionable insights to stakeholders.
- Good experience as an offshore team lead; should have good experience in the onsite/offshore model as a lead/SPOC.
- Should be able to gather requirements by interacting directly with users, create BRDs and TSDs, manage the offshore team, and provide technical support.
- Able to handle end-to-end activities.
- Must be good at data transformation, the creation of QVD files, and set analysis.
- Experienced in application design, architecture, development and deployment using Qlik Sense.
- Must be efficient in front-end development and know visualization best practices.
- Strong database design and SQL skills.
- Experienced in data integration through extracting, transforming and loading (ETL) data from various sources.
- Able to comprehend and translate complex and advanced functional, technical and business requirements into executable architectural designs.
- Hands-on experience designing, implementing, testing and supporting reports and dashboards within the agreed SLA.
- Working experience with charts in Qlik Sense such as KPI, line, straight table, pivot table, pie, bar, combo, radar and map charts.
- Strong working experience with set analysis (set expressions) and selection states.
- Working knowledge of building YTD, LYTD, QTD, LQTD, MTD, LMTD, WTD and LWTD measures using set analysis.
- Experience with Qlik native functions (string, date, aggregate, row, conditional, etc.).
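To make the YTD/LYTD measures above concrete, here is what those period-to-date aggregations compute, sketched in stdlib Python. In Qlik Sense the same logic is normally expressed as a set expression against a master calendar; the sales rows and reference date below are invented sample data:

```python
from datetime import date

sales = [
    (date(2024, 1, 15), 100),
    (date(2024, 3, 2), 250),
    (date(2024, 3, 20), 50),    # after the reference date: excluded from YTD
    (date(2023, 3, 1), 400),    # prior year: counts toward LYTD
]
ref = date(2024, 3, 10)         # plays the role of the current selection

def ytd(rows, ref):
    """Year-to-date: same calendar year, on or before the reference date."""
    return sum(a for d, a in rows if d.year == ref.year and d <= ref)

def lytd(rows, ref):
    """Last year to date: shift the reference date back one year."""
    return ytd(rows, ref.replace(year=ref.year - 1))

print(ytd(sales, ref), lytd(sales, ref))   # -> 350 400
```

MTD, QTD, and WTD follow the same pattern with the window clamped to the month, quarter, or week of the reference date instead of the year.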

Posted 3 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Senior Technical Trainer - Cloud, Data & AI/ML
Location: Pune
Experience Required: 10+ years

About the Role:
We're looking for an experienced and passionate technical trainer who can help elevate our teams' capabilities in cloud technologies, data engineering, and AI/ML. This role is ideal for someone who enjoys blending hands-on tech skills with a strong ability to simplify, teach, and mentor. As we grow and scale at Meta For Data, building internal expertise is a key part of our strategy, and you'll be central to that effort.

What You'll Be Doing:
- Lead and deliver in-depth training sessions (both live and virtual) across areas like cloud architecture, data engineering, and machine learning.
- Build structured training content, including presentations, labs, exercises, and assessments.
- Develop learning journeys tailored to different experience levels and roles, ranging from new hires to experienced engineers.
- Continuously update training content to reflect changes in tools, platforms, and best practices.
- Collaborate with engineering, HR, and L&D teams to roll out training schedules, track attendance, and gather feedback.
- Support ongoing learning post-training through mentoring, labs, and knowledge checks.

What We're Looking For:
- Around 10 years of experience across software development, cloud/data/ML engineering, and technical training.
- Deep familiarity with at least one cloud platform (AWS, Azure, or GCP); AWS or Azure is preferred.
- A strong grip on data platforms, ETL pipelines, big data tools (like Spark or Hadoop), and warehouse systems.
- A solid understanding of the AI/ML lifecycle (model building, tuning, deployment), with hands-on experience in Python-based libraries (e.g., TensorFlow, PyTorch, scikit-learn).
- A confident communicator who's comfortable speaking to groups and explaining complex concepts simply.
- Bonus if you hold any relevant certifications like AWS Solutions Architect, Google Data Engineer, or Microsoft AI Engineer.

Nice to Have:
- Experience creating online training modules or managing LMS platforms.
- Prior experience training diverse audiences: tech teams, analysts, product managers, etc.
- Familiarity with MLOps and modern deployment practices for AI models.

Why Join Us?
- You'll have the freedom to shape how technical learning happens at Meta For Data.
- You'll be part of a team that values innovation, autonomy, and real impact.
- Flexible working options and a culture that supports growth, for our teams and our trainers.

Posted 3 days ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Data Engineer
We are looking for an experienced Data Engineer with strong expertise in Snowflake, dbt, Airflow, AWS, and modern data technologies like Python, Apache Spark, and NoSQL databases. The role focuses on designing, building, and optimizing data pipelines to support analytical and regulatory needs in the banking domain.

Key Responsibilities
- Design and implement scalable and secure data pipelines using Airflow, dbt, Snowflake, and AWS services.
- Develop data transformation workflows and modular SQL logic using dbt for a centralized data warehouse in Snowflake.
- Build batch and near-real-time data processing solutions using Apache Spark and Python.
- Work with structured and unstructured banking datasets stored across S3, NoSQL (e.g., MongoDB, DynamoDB), and relational databases.
- Ensure data quality, lineage, and observability through logging, testing, and monitoring tools.
- Support data needs for compliance, regulatory reporting, risk, fraud, and customer analytics.
- Ensure secure handling of sensitive data aligned with banking compliance standards (e.g., PII masking, role-based access).
- Collaborate closely with business users, data analysts, and data scientists to deliver production-grade datasets.
- Implement best practices for code versioning, CI/CD, and environment management.

Required Skills and Qualifications
- 5-8 years of experience in data engineering, preferably in banking, fintech, or regulated industries.
- Hands-on experience with:
  - Snowflake (data modeling, performance tuning, security)
  - dbt (modular SQL transformation, documentation, testing)
  - Airflow (orchestration, DAGs)
  - AWS (S3, Glue, Lambda, Redshift, IAM)
  - Python (ETL scripting, data manipulation)
  - Apache Spark (batch/stream processing using PySpark or Scala)
  - NoSQL databases (e.g., DynamoDB, MongoDB, Cassandra)
- Strong SQL skills and experience in performance optimization and cost-efficient query design.
- Exposure to data governance, compliance, and security in the banking industry.
- Experience working with large-scale datasets and complex data transformations.
- Familiarity with version control (e.g., Git) and CI/CD pipelines.

Preferred Qualifications
- Prior experience in banking/financial services.
- Knowledge of Kafka or other streaming platforms.
- Exposure to data quality tools (e.g., Great Expectations, Soda).
- Certifications in Snowflake, AWS, or dbt.
- Strong communication skills and ability to work with cross-functional teams.

About Convera
Convera is the largest non-bank B2B cross-border payments company in the world. Formerly Western Union Business Solutions, we leverage decades of industry expertise and technology-led payment solutions to deliver smarter money movements to our customers, helping them capture more value with every transaction. Convera serves more than 30,000 customers, ranging from small business owners to enterprise treasurers, educational institutions, financial institutions, law firms and NGOs.

Our teams care deeply about the value we bring to our customers, which makes Convera a rewarding place to work. This is an exciting time for our organization as we build our team with growth-minded, results-oriented people who are looking to move fast in an innovative environment. As a truly global company with employees in over 20 countries, we are passionate about diversity; we seek and celebrate people from different backgrounds, lifestyles, and unique points of view. We want to work with the best people and ensure we foster a culture of inclusion and belonging.

We offer an abundance of competitive perks and benefits, including:
- Competitive salary
- Opportunity to earn an annual bonus
- Great career growth and development opportunities in a global organization
- A flexible approach to work

There are plenty of amazing opportunities at Convera for talented, creative problem solvers who never settle for good enough and are looking to transform business-to-business payments. Apply now if you're ready to unleash your potential.
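The PII-masking requirement in the banking pipeline above is often met with deterministic keyed hashing, which lets masked columns still join across tables without exposing raw values. A minimal sketch, assuming made-up field names; a real key would live in a secrets manager, never in code:

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me-via-a-secrets-manager"   # placeholder key

def mask_pii(value: str) -> str:
    """Keyed HMAC-SHA256, truncated: deterministic, so joins still work."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

record = {"customer_id": "C-1001", "email": "a@example.com", "balance": 250.0}
masked = {
    **record,
    "customer_id": mask_pii(record["customer_id"]),
    "email": mask_pii(record["email"]),
}
```

The HMAC (rather than a bare hash) matters: without the secret key, low-entropy fields like emails can be reversed by brute-force enumeration. Non-sensitive measures such as `balance` pass through untouched for analytics.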

Posted 3 days ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Software Engineer Consultant/Expert 34326 Location: Chennai Work Type: Contract (Onsite) Compensation: Up to ₹21–24 LPA (Based on experience) Notice Period: Immediate joiners preferred Experience: Minimum 7+ years (9 preferred) Position Summary Seeking a skilled and motivated Full Stack Java Developer to join a growing software engineering team responsible for building and supporting a global logistics data warehouse platform. This platform provides end-to-end visibility into vehicle shipments using GCP cloud technologies, microservices architecture, and real-time data processing pipelines. Key Responsibilities Design, develop, and maintain robust backend systems using Java, Spring Boot, and microservices architecture Implement and optimize REST APIs, and integrate with Pub/Sub, Kafka, and other event-driven systems Build and maintain scalable data processing workflows using GCP BigQuery, Cloud Run, and Terraform Collaborate with product managers, architects, and fellow engineers to deliver impactful features Perform unit testing, integration testing, and support functional and user acceptance testing Conduct code reviews and provide mentorship to other engineers to improve code quality and standards Monitor system performance and implement strategies for optimization and scalability Develop and maintain ETL/data pipelines to transform and manage logistics data Continuously refactor and enhance existing code for maintainability and performance Required Skills Strong hands-on experience with Java, Spring Boot, and full stack development Proficiency with GCP Cloud Platform, including at least 1 year of experience with BigQuery Experience with GCP Cloud Run, Terraform, and deploying containerized services Deep understanding of REST APIs, microservices, Pub/Sub, Kafka, and cloud-native architectures Experience in ETL development, data engineering, or data warehouse projects Exposure to AI/ML integration in enterprise applications is a plus Preferred
Skills Familiarity with AI agents and modern AI-driven data products Experience working with global logistics, supply chain, or transportation domains Education Requirements Required: Bachelor’s degree in Computer Science, Information Technology, or related field Preferred: Advanced degree or specialized certifications in cloud or data engineering Work Environment Location: Chennai (Onsite required) Work closely with cross-functional product teams in an Agile setup Fast-paced, data-driven environment requiring strong communication and problem-solving skills Skills: rest apis,cloud run,bigquery,gcp,pub/sub,data,data engineering,kafka,microservices,terraform,cloud,spring boot,data warehouse,java,code,etl development,full stack development,gcp cloud platform

Posted 3 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Position Title: Software Engineer Consultant/Expert – GCP Data Engineer 34350 Location: Chennai Engagement Type: Contract Compensation: Up to ₹18 LPA Notice Period: Immediate joiners preferred Work Mode: Onsite Role Overview This role is for a proactive Google Cloud Platform (GCP) Data Engineer who will contribute to the modernization of a cloud-based enterprise data warehouse. The ideal candidate will focus on integrating diverse data sources to support advanced analytics and AI/ML-driven solutions, as well as designing scalable pipelines and data products for real-time and batch processing. This opportunity is ideal for individuals who bring both architectural thinking and hands-on experience with GCP services, big data processing, and modern DevOps practices. Key Responsibilities Design and implement scalable, cloud-native data pipelines and solutions using GCP technologies Develop ETL/ELT processes to ingest and transform data from legacy and modern platforms Collaborate with analytics, AI/ML, and product teams to enable data accessibility and usability Analyze large datasets and perform impact assessments across various functional areas Build data products (data marts, APIs, views) that power analytical and operational platforms Integrate batch and real-time data using tools like Pub/Sub, Kafka, Dataflow, and Cloud Composer Operationalize deployments using CI/CD pipelines and infrastructure as code Ensure performance tuning, optimization, and scalability of data platforms Contribute to best practices in cloud data security, governance, and compliance Provide mentorship, guidance, and knowledge-sharing within cross-functional teams Mandatory Skills GCP expertise with hands-on use of services including: BigQuery, Dataflow, Data Fusion, Dataform, Dataproc Cloud Composer (Airflow), Cloud SQL, Compute Engine Cloud Functions, Cloud Run, Cloud Build, App Engine Strong knowledge of SQL, data modeling, and data architecture Minimum 5+ years of experience in SQL and ETL 
development At least 3 years of experience in GCP cloud environments Experience with Python, Java, or Apache Beam Proficiency in Terraform, Docker, Tekton, and GitHub Familiarity with Apache Kafka, Pub/Sub, and microservices architecture Understanding of AI/ML integration, data science concepts, and production datasets Preferred Experience Hands-on expertise in container orchestration (e.g., Kubernetes) Experience working in regulated environments (e.g., finance, insurance) Knowledge of DevOps pipelines, CI/CD, and infrastructure automation Background in coaching or mentoring junior data engineers Experience with data governance, compliance, and security best practices in the cloud Use of project management tools such as JIRA Proven ability to work independently in fast-paced or ambiguous environments Strong communication and collaboration skills to interact with cross-functional teams Education Requirements Required: Bachelor's degree in Computer Science, Information Systems, Engineering, or related field Preferred: Master's degree or relevant industry certifications (e.g., GCP Data Engineer Certification) Skills: bigquery,cloud sql,ml,apache beam,app engine,gcp,dataflow,microservices architecture,cloud functions,compute engine,project management tools,data science concepts,security best practices,pub/sub,ci/cd,compliance,cloud run,java,cloud build,jira,data,pipelines,dataproc,sql,tekton,python,github,data modeling,cloud composer,terraform,data fusion,cloud,data architecture,apache kafka,ai/ml integration,docker,data governance,infrastructure automation,dataform
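The ELT pattern this role describes (land raw data, then transform it inside the warehouse with SQL into analytics-ready data products) can be illustrated with a self-contained toy example; SQLite stands in for BigQuery here, and all table and column names are invented:

```python
import sqlite3

# Toy ELT: land raw events, then build a "data product" (daily mart) with SQL.
# SQLite stands in for a cloud warehouse; names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (event_date TEXT, amount REAL);
    INSERT INTO raw_events VALUES
        ('2024-01-01', 10.0), ('2024-01-01', 5.0), ('2024-01-02', 7.5);
""")

# Transform step: aggregate raw data into an analytics-ready table,
# the kind of model a tool like Dataform or dbt would manage.
conn.executescript("""
    CREATE TABLE mart_daily_totals AS
    SELECT event_date, SUM(amount) AS total_amount, COUNT(*) AS n_events
    FROM raw_events
    GROUP BY event_date;
""")

rows = conn.execute(
    "SELECT event_date, total_amount, n_events FROM mart_daily_totals ORDER BY event_date"
).fetchall()
```

The point of the pattern is that the transform is just SQL running where the data already lives, which is what makes it orchestratable by Cloud Composer and testable in CI/CD.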

Posted 3 days ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Job Responsibility Data Handling and Processing: •Proficient in SQL Server and query optimization. •Expertise in application data design and process management. •Extensive knowledge of data modelling. •Hands-on experience with Azure Data Factory, Azure Synapse Analytics, and Microsoft Fabric. •Experience working with Azure Databricks. •Expertise in data warehouse development, including experience with SSIS (SQL Server Integration Services) and SSAS (SQL Server Analysis Services). •Proficiency in ETL processes (data extraction, transformation, and loading), including data cleaning and normalization. •Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) for large-scale data processing. •Understanding of data governance, compliance, and security measures within Azure environments. Data Analysis and Visualization: •Experience in data analysis, statistical modelling, and machine learning techniques. •Proficiency in analytical tools like Python, R, and libraries such as Pandas, NumPy for data analysis and modelling. •Strong expertise in Power BI for data visualization, data modelling, and DAX queries, with knowledge of best practices. •Experience in implementing Row-Level Security in Power BI. •Ability to work with medium-complex data models and quickly understand application data design and processes. •Familiar with industry best practices for Power BI and experienced in performance optimization of existing implementations. •Understanding of machine learning algorithms, including supervised, unsupervised, and deep learning techniques. Non-Technical Skills: •Ability to lead a team of 4-5 developers and take ownership of deliverables. •Demonstrates a commitment to continuous learning, particularly with new technologies. •Strong communication skills in English, both written and verbal. •Able to effectively interact with customers during project implementation. •Capable of explaining complex technical concepts to non-technical stakeholders. 
Data Management: SQL, Azure Synapse Analytics, Azure Analysis Services and Data Marts, Microsoft Fabric ETL Tools: Azure Data Factory, Azure Databricks, Python, SSIS Data Visualization: Power BI, DAX
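The Row-Level Security requirement mentioned above (for Power BI) comes down to one idea: each viewer sees only the rows their identity entitles them to. A language-agnostic sketch of that idea, with invented users, roles, and data:

```python
# Conceptual sketch of Row-Level Security: each user only sees rows their
# region entitlement allows. The mapping and sample data are invented.
USER_REGIONS = {
    "alice": {"EMEA"},
    "bob": {"APAC", "EMEA"},
}

SALES = [
    {"region": "EMEA", "amount": 100},
    {"region": "APAC", "amount": 200},
    {"region": "AMER", "amount": 300},
]

def visible_rows(user: str, rows: list[dict]) -> list[dict]:
    """Return only the rows the user's region entitlement permits."""
    allowed = USER_REGIONS.get(user, set())
    return [r for r in rows if r["region"] in allowed]
```

In Power BI the same filter is expressed as a DAX rule on a security role (e.g. matching a region column against the signed-in user), but the semantics are the ones shown here.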

Posted 3 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About This Role Wells Fargo is seeking a Lead Analytics Consultant - People Analytics. As a consultant, you will work as an analytics professional in the HR People Analytics and Business Insights delivery team and will be responsible for effective delivery of projects as per the business priority. The incumbent is expected to be an expert in executive summaries, people strategy, HR consulting, HR advisory, advanced analytics & data science, and adding value to projects. In This Role, You Will Advise line of business and companywide functions on business strategies based on research of performance metrics, trends in population distributions, and other complex data analysis to maximize profits and asset growth, and minimize operating losses within risk and other operating standards Provide influence and leadership in the identification of new tools and methods to analyze data Ensure adherence to compliance and legal regulations and policies on all projects managed Provide updates on project logs, monthly budget forecasts, monthly newsletters, and operations reviews Assist managers in building quarterly and annual plans and forecast future market research needs for business partners supported Strategically collaborate and consult with peers, colleagues, and mid-level to senior managers to resolve issues and achieve goals Lead projects, teams, or serve as a peer mentor to staff, interns and external contractors Required Qualifications: 5+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education Desired Qualifications: 5+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education 5+ years of experience working with Tableau/Power BI/SQL Experience working with complex datasets using SQL 5+ years of experience with creating visualizations.
Dashboarding experience involving multiple views that all respond to navigation/filters/etc. Ability to publish content that can be reused across dashboards/workbooks and used for self-service by other analysts working on the same domain (and/or to reuse cubes created by others where expedient). Demonstrate comprehensive understanding of HR business and related processes. Collaborate with cross-functional teams to address servicing challenges and optimize processes. Able to work as an Individual Contributor and deliver end-to-end product development. Good experience working on SQL/PL-SQL Domain understanding of HR and its complete product lifecycle (Hire to Retire) will be an added advantage. Experience working with SAS programming. Knowledge of Tableau Prep and/or Alteryx a plus. Working with Python or other data science tools will be an added advantage. Hands-on experience in ETL development using any ETL tools. Good to have certifications in BI Reporting tools, Data Management, or Data Engineering. Expected to learn the business aspects quickly, multitask and prioritize between projects. Dedicated, enthusiastic, driven and performance-oriented; possesses a strong work ethic and is a good team player. Job Expectations: Detail oriented, results driven, and able to navigate a quickly changing and high-demand environment while balancing multiple priorities. Simple work-documentation skills: requirements, query documentation, testing. Consultative skills: the ability to draw out the business need and solution design from people who do not yet know how to ask precisely for what they need. Strong written and verbal communication, presentation, and interpersonal skills. Ability to perform analysis, build hypotheses, draw conclusions, and communicate clear, actionable recommendations to business leaders & partners. Ability to interact with integrity and a high level of professionalism with all levels of team members and management.
Posting End Date: Job posting may come down early due to volume of applicants. We Value Equal Opportunity Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants With Disabilities To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo . Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more. Wells Fargo Recruitment And Hiring Requirements Third-Party recordings are prohibited unless authorized by Wells Fargo. 
Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process. Reference Number R-446112

Posted 3 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Position Overview: As a Senior Analytics Engineer, you will be responsible for the architecture, development and implementation of robust data engineering solutions that support our organization's data-driven initiatives, ensure data accuracy, and enable informed decision-making across the organization. The ideal candidate will possess a minimum of 5 years of hands-on experience in data engineering on high-performing teams. Expertise in Databricks, dbt, SQL, & Python is a critical requirement for this role. Key Responsibilities: Data Pipeline Development: Design, develop, and maintain ETL (Extract, Transform, Load) pipelines primarily using Databricks, dbt, and SQL to collect and process data from various sources into a centralized data warehouse. Data Modeling: Create and maintain data models, data dictionaries, and documentation to support efficient data analysis and reporting. Data Quality Assurance: Implement data validation and cleansing processes to ensure data accuracy, consistency, and reliability. Data Governance: Ensure data security, compliance, and governance standards are met, and contribute to data governance initiatives. Mentor and develop the analytics engineering team, and provide best-practice guidance to the broader analytics community. Collaboration: Collaborate with cross-functional teams, including data scientists, business analysts, and stakeholders, to understand their data needs and deliver solutions. Performance Optimization: Continuously monitor and optimize data pipelines and analytics processes for efficiency and scalability. Ad-hoc Data Analysis and Reporting/Dashboard Development: Perform exploratory data analysis, develop data visualizations, and generate actionable insights to support business decision-making. Stay Current: Stay up-to-date with emerging trends and technologies in data engineering and analytics, and make recommendations for their adoption.
Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum 5+ years of hands-on experience in data engineering Expertise in building data pipelines and ETL processes using Databricks, dbt and Python Strong understanding of data warehousing concepts and methodologies. Experience with cloud platforms such as Azure, AWS or GCP Excellent communication and interpersonal skills Excellent problem-solving skills and attention to detail Strong communication and teamwork abilities. Knowledge of data security and compliance standards is a plus

Posted 3 days ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Principal Data Engineer Primary Skills Data Modeling (ER Studio/ER Win) Specialization Job requirements About Brillio: Brillio is one of the fastest growing digital technology service providers and a partner of choice for many Fortune 1000 companies seeking to turn disruption into a competitive advantage through innovative digital adoption. Brillio, renowned for its world-class professionals, referred to as "Brillians", distinguishes itself through their capacity to seamlessly integrate cutting-edge digital and design thinking skills with an unwavering dedication to client satisfaction. Brillio takes pride in its status as an employer of choice, consistently attracting the most exceptional and talented individuals due to its unwavering emphasis on contemporary, groundbreaking technologies, and exclusive digital projects. Brillio's relentless commitment to providing an exceptional experience to its Brillians and nurturing their full potential consistently garners them the Great Place to Work® certification year after year. 10+ years of overall IT experience, including at least 2 years of experience in Snowflake with hands-on experience in modeling In-depth understanding of Data Lake/ODS, ETL concepts and modeling structure principles. Ability to partner closely with clients to deliver and drive physical and logical model solutions Experience in data warehousing - dimensions, facts, and data modeling. Excellent communication and ability to lead teams. Good to have exposure to the AWS ecosystem.
Technical Skills Data Modeling concepts (Advanced Level) Modeling experience across large volume-based environments Experience with ER Studio Overall understanding of the database space (databases, data warehouses, reporting, analytics) Well versed with data modeling platforms such as SQLDBM Strong skills in Entity Relationship Modeling Good knowledge of database design and administration, and advanced data modeling for star-schema design Good understanding of normalized/denormalized and star/snowflake design concepts SQL query development for investigation and analysis
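The dimension/fact concepts listed above can be made concrete with a minimal star-schema example. The schema and data below are invented for illustration (SQLite is used only because it is self-contained):

```python
import sqlite3

# Minimal star schema: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER,
                             FOREIGN KEY (product_key) REFERENCES dim_product);
    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
    INSERT INTO fact_sales VALUES (1, 3), (1, 2), (2, 7);
""")

# Typical star-schema query: roll the fact table up by a dimension attribute.
result = conn.execute("""
    SELECT d.category, SUM(f.qty)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
```

The same shape scales: facts stay narrow (keys and measures), dimensions carry the descriptive attributes, and every analytical query is a join-then-aggregate over that skeleton.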

Posted 3 days ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Us: MUFG Bank, Ltd. is Japan’s premier bank, with a global network spanning more than 40 markets. Outside of Japan, the bank offers an extensive scope of commercial and investment banking products and services to businesses, governments, and individuals worldwide. MUFG Bank’s parent, Mitsubishi UFJ Financial Group, Inc. (MUFG) is one of the world’s leading financial groups. Headquartered in Tokyo and with over 360 years of history, the Group has about 120,000 employees and offers services including commercial banking, trust banking, securities, credit cards, consumer finance, asset management, and leasing. The Group aims to be the world’s most trusted financial group through close collaboration among our operating companies, flexibly responding to all the financial needs of our customers, serving society, and fostering shared and sustainable growth for a better world. MUFG’s shares trade on the Tokyo, Nagoya, and New York stock exchanges. MUFG Global Service Private Limited: Established in 2020, MUFG Global Service Private Limited (MGS) is a 100% subsidiary of MUFG with offices in Bengaluru and Mumbai. MGS India has been set up as a Global Capability Centre / Centre of Excellence to provide support services across various functions such as IT, KYC/AML, Credit, Operations etc. to MUFG Bank offices globally. MGS India plans to significantly ramp up its growth over the next 18-24 months while servicing MUFG’s global network across the Americas, EMEA and Asia Pacific. Position Title: DEBI Senior Analyst Corporate Title: Senior Analyst Location: MUFG Global Services Pvt. Ltd., Bhartiya Centre for Information Technology, Thani Sandra, Main Road, Bengaluru, Karnataka. Working Hours: Primarily HK or UK hours (5:30am to 2:30pm or 1:30pm to 10:30pm). Job Profile: Position details: To perform BAU application and operation support on multiple Data Management Systems across MUFG Bank EMEA and MUS International.
The individual is to work as the senior analyst for applications within the DEBI (Data Engineering and Business Intelligence) BAU team, which supports various applications and manages the resolution of incidents, problems and changes. Roles and Responsibilities: Delivery: Work in shifts based on a rota covering HK and EMEA hours. Identify and resolve technical issues. Maintain and update system technical documents and service operation manuals. Ensure the support provided is within the Service Level Agreement set with business and technology groups. Ensure IT governance, standards and procedures are adhered to at all times. Plan and perform software releases as a part of the Release and Change process. Develop and deliver application software changes according to requirements. Investigate production incidents/problems and provide solutions to fix the issues. Assess impact on supported systems based on production changes. Follow standard tooling such as ServiceNow to manage Incidents, Problems and Changes. Maintain a good and detailed knowledge of AMD DEBI (Data Engineering and Business Intelligence) applications so as to effectively provide BAU support. Ensure effective communications are maintained through timely sharing of information with technology and business stakeholders. Cover weekend support on a team rota basis. Risk & Compliance: Ensure all activity fully complies with the appropriate policies, procedures and controls Strict adherence to change management and privileged production access management processes Culture and Leadership: Promote the MUFG values-led culture which is inclusive and diverse. Promote a dynamic, delivery-driven culture that works alongside business units to provide responsive resolutions and value-driven solutions. Job Requirements: The ideal candidate will be part of the DEBI application support team; we are looking for a person with strong experience (4+ years) working in large-scale teams and supporting production applications.
ETL Tools: SSIS, Snowflake Database: Oracle, SQL Server Reporting tools: SSRS, Power BI (Good to have) Batch job scheduler such as Control-M (preferred) or any other similar scheduling tool PowerShell, PL/SQL, SQL Server stored procedures. Experience in application development, database development and system support. Experience with Java and Python (advantageous). Functional / Technical Competencies: Knowledge of the IT infrastructure (for example, hardware, databases, operating systems, local area networks) and the IT applications and service processes used within a Financial Services organization. Required to have a good understanding of software development methodologies. Good data analysis and SQL skills. Strong analytical and problem-solving skills. Possess good verbal and written communication skills. Personal Requirements: Strong decision-making skills, the ability to demonstrate sound judgement. A structured and logical approach to work. Strong problem-solving skills. A creative and innovative approach to work. Excellent interpersonal skills. The ability to manage large workloads and tight deadlines. Excellent attention to detail and accuracy. A calm approach, with the ability to perform well in a pressurized environment. Strong numerical skills

Posted 3 days ago

Apply

5.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

We are Lenovo. We do what we say. We own what we do. We WOW our customers. Lenovo is a US$57 billion revenue global technology powerhouse, ranked #248 in the Fortune Global 500, and serving millions of customers every day in 180 markets. Focused on a bold vision to deliver Smarter Technology for All, Lenovo has built on its success as the world’s largest PC company with a full-stack portfolio of AI-enabled, AI-ready, and AI-optimized devices (PCs, workstations, smartphones, tablets), infrastructure (server, storage, edge, high performance computing and software defined infrastructure), software, solutions, and services. Lenovo’s continued investment in world-changing innovation is building a more equitable, trustworthy, and smarter future for everyone, everywhere. Lenovo is listed on the Hong Kong stock exchange under Lenovo Group Limited (HKSE: 992) (ADR: LNVGY). This transformation together with Lenovo’s world-changing innovation is building a more inclusive, trustworthy, and smarter future for everyone, everywhere. To find out more visit www.lenovo.com, and read about the latest news via our StoryHub. BS/BA in Computer Science, Mathematics, Statistics, MIS, or related At least 5 years' experience in the data warehouse space. At least 5 years' experience in custom ETL/ELT design, implementation and maintenance. At least 5 years' experience in writing SQL statements. At least 3 years' experience with Cloud based data platform technologies such as Google Big Query, or Azure/Snowflake data platform equivalent. Ability in managing and communicating data warehouse plans to internal clients. We are an Equal Opportunity Employer and do not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, national origin, status as a veteran, and basis of disability or any federal, state, or local protected class.

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra, India

On-site

Wissen Technology is Hiring for SQL With Python About Wissen Technology: Wissen Technology is a globally recognized organization known for building solid technology teams, working with major financial institutions, and delivering high-quality solutions in IT services. With a strong presence in the financial industry, we provide cutting-edge solutions to address complex business challenges. Role Overview: We are looking for a skilled and detail-oriented candidate with a strong foundation in SQL, Python, and data processing techniques. The ideal candidate is passionate about transforming raw data into meaningful insights and has hands-on experience across the data pipeline, from data wrangling to visualization. Experience: 3-7 Years Location: Bengaluru Required Skills: Strong experience with SQL (e.g., joins, subqueries, CTEs, window functions). Proficiency in Python for data manipulation (e.g., pandas, NumPy). Experience working with relational databases like MySQL, PostgreSQL, SQL Server, or Oracle. Hands-on experience in data wrangling, cleaning, and feature engineering. Understanding of ETL processes and tools. Familiarity with version control systems like Git. Knowledge of data visualization techniques and tools. Strong problem-solving and analytical skills. The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world class products. We offer an array of services including Core Business Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud Adoption, Mobility, Digital Adoption, Agile & DevOps, Quality Assurance & Test Automation.
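The SQL skills the posting above names (CTEs and window functions) fit in one small runnable example, driven from Python as the role describes. The table and data are invented; window functions require SQLite 3.25+ (bundled with any recent Python):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE scores (team TEXT, player TEXT, pts INTEGER);
    INSERT INTO scores VALUES
        ('A', 'p1', 30), ('A', 'p2', 20), ('B', 'p3', 25), ('B', 'p4', 40);
""")

# CTE + window function: rank players within each team by points,
# then keep only each team's top scorer.
top = conn.execute("""
    WITH ranked AS (
        SELECT team, player, pts,
               ROW_NUMBER() OVER (PARTITION BY team ORDER BY pts DESC) AS rn
        FROM scores
    )
    SELECT team, player FROM ranked WHERE rn = 1 ORDER BY team
""").fetchall()
```

The same PARTITION BY/ORDER BY pattern answers "top-N per group" questions that would otherwise need self-joins or correlated subqueries.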
Over the years, Wissen Group has successfully delivered $1 billion worth of projects for more than 20 of the Fortune 500 companies. Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them with the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients. We have been certified as a Great Place to Work® company for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider. Great Place to Work® Certification is recognized world over by employees and employers alike and is considered the ‘Gold Standard’. Wissen Technology has created a Great Place to Work by excelling in all dimensions - High-Trust, High-Performance Culture, Credibility, Respect, Fairness, Pride and Camaraderie.
Website: www.wissen.com LinkedIn: https://www.linkedin.com/company/wissen-technology Wissen Leadership: https://www.wissen.com/company/leadership-team/ Wissen Live: https://www.linkedin.com/company/wissen-technology/posts/feedView=All Wissen Thought Leadership: https://www.wissen.com/articles/ Employee Speak: https://www.ambitionbox.com/overview/wissen-technology-overview https://www.glassdoor.com/Reviews/Wissen-Infotech-Reviews-E287365.htm Great Place to Work: https://www.wissen.com/blog/wissen-is-a-great-place-to-work-says-the-great-place-to-work-institute-india/ https://www.linkedin.com/posts/wissen-infotech_wissen-leadership-wissenites-activity-6935459546131763200-xF2k About Wissen Interview Process: https://www.wissen.com/blog/we-work-on-highly-complex-technology-projects-here-is-how-it-changes-whom-we-hire/ Latest in Wissen in CIO Insider: https://www.cioinsiderindia.com/vendor/wissen-technology-setting-new-benchmarks-in-technology-consulting-cid-1064.html

Posted 3 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Job Description Senior Data Engineer Our Enterprise Data & Analytics (EDA) team is looking for an experienced Senior Data Engineer to join our growing data engineering team. You’ll work in a collaborative Agile environment using the latest engineering best practices with involvement in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural & data modeling practices to maintain the foundation data layer serving as a single source of truth across Zendesk. You will be primarily developing Data Warehouse Solutions in BigQuery/Snowflake using technologies such as dbt, Airflow, Terraform. What You Get To Do Every Single Day Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes and design data models Serve as Data Model subject matter expert and data model spokesperson, demonstrated by the ability to address questions quickly and accurately Implement Enterprise Data Warehouse by transforming raw data into schemas and data models for various business domains using SQL & dbt Design, build, and maintain ELT pipelines in Enterprise Data Warehouse to ensure reliable business reporting using Airflow, Fivetran & dbt Optimize data warehousing processes by refining naming conventions, enhancing data modeling, and implementing best practices for data quality testing Build analytics solutions that provide practical insights into customer 360, finance, product, sales and other key business domains Build and Promote best engineering practices in areas of version control system, CI/CD, code review, pair programming Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery Work with data and analytics experts to strive for greater functionality in our data systems Basic Qualifications What you bring to the role: 5+ years of data engineering experience building, working & maintaining data
pipelines & ETL processes in big data environments 5+ years of experience in Data Modeling and Data Architecture in a production environment 5+ years of experience writing complex SQL queries 5+ years of experience with cloud columnar databases (we use Snowflake) 2+ years of production experience working with dbt and designing and implementing Data Warehouse solutions Ability to work closely with data scientists, analysts, and other stakeholders to translate business requirements into technical solutions. Strong documentation skills for pipeline design and data flow diagrams. Intermediate experience with at least one programming language (Python, Go, Java, Scala); we primarily use Python Experience integrating with third-party SaaS application APIs such as Salesforce, Zuora, etc. Ensure data integrity and accuracy by conducting regular data audits, identifying and resolving data quality issues, and implementing data governance best practices. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. Preferred Qualifications Hands-on experience with the Snowflake data platform, including administration, SQL scripting, and query performance tuning Good knowledge of modern as well as classic data modeling approaches - Kimball, Inmon, etc. Demonstrated experience in one or more business domains (Finance, Sales, Marketing) 3+ completed “production-grade” projects with dbt Expert knowledge of Python What Our Data Stack Looks Like ELT (Snowflake, Fivetran, dbt, Airflow, Kafka, Hightouch) BI (Tableau, Looker) Infrastructure (GCP, AWS, Kubernetes, Terraform, GitHub Actions) Please note that Zendesk can only hire candidates who are physically located and plan to work from Karnataka or Maharashtra. Please refer to the location posted on the requisition for where this role is based. 
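For illustration only (not part of the posting): the ELT pattern these responsibilities describe - land raw data in the warehouse first, then transform it with SQL models, as dbt does - can be sketched with the standard library. Here sqlite3 stands in for Snowflake/BigQuery, and all table, column, and function names are hypothetical.

```python
# Minimal ELT sketch: load raw data, then transform inside the warehouse
# with a SQL "model", mirroring the Fivetran -> warehouse -> dbt flow.
import sqlite3

def extract():
    # In production this step is a Fivetran sync from a SaaS source.
    return [("t1", "open", 2), ("t2", "solved", 5), ("t3", "solved", 1)]

def load(conn, rows):
    # Land raw records untransformed, as ELT prescribes.
    conn.execute("CREATE TABLE raw_tickets (id TEXT, status TEXT, replies INTEGER)")
    conn.executemany("INSERT INTO raw_tickets VALUES (?, ?, ?)", rows)

def transform(conn):
    # A dbt-style model: a curated table derived from raw data via SQL.
    conn.execute("""
        CREATE TABLE fct_ticket_metrics AS
        SELECT status, COUNT(*) AS tickets, AVG(replies) AS avg_replies
        FROM raw_tickets GROUP BY status
    """)

conn = sqlite3.connect(":memory:")
load(conn, extract())
transform(conn)
metrics = {
    status: tickets
    for status, tickets, _ in conn.execute(
        "SELECT * FROM fct_ticket_metrics ORDER BY status"
    )
}
print(metrics)  # {'open': 1, 'solved': 2}
```

The key ELT design choice, versus classic ETL, is that transformation logic lives as versioned SQL inside the warehouse, where tools like dbt can test and document it.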
Hybrid: In this role, our hybrid experience is designed at the team level to give you a rich onsite experience packed with connection, collaboration, learning, and celebration - while also giving you flexibility to work remotely for part of the week. This role must attend our local office for part of the week. The specific in-office schedule is to be determined by the hiring manager. The Intelligent Heart Of Customer Experience Zendesk software was built to bring a sense of calm to the chaotic world of customer service. Today we power billions of conversations with brands you know and love. Zendesk believes in offering our people a fulfilling and inclusive experience. Our hybrid way of working enables us to purposefully come together in person, at one of our many Zendesk offices around the world, to connect, collaborate and learn, whilst also giving our people the flexibility to work remotely for part of the week. Zendesk is an equal opportunity employer, and we’re proud of our ongoing efforts to foster global diversity, equity, and inclusion in the workplace. Individuals seeking employment and employees at Zendesk are considered without regard to race, color, religion, national origin, age, sex, gender, gender identity, gender expression, sexual orientation, marital status, medical condition, ancestry, disability, military or veteran status, or any other characteristic protected by applicable law. We are an AA/EEO/Veterans/Disabled employer. If you are based in the United States and would like more information about your EEO rights under the law, please click here. Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans pursuant to applicable federal and state law. 
If you are an individual with a disability and require a reasonable accommodation to submit this application, complete any pre-employment testing, or otherwise participate in the employee selection process, please send an e-mail to peopleandplaces@zendesk.com with your specific accommodation request.

Posted 3 days ago

Apply

1.0 years

5 - 14 Lacs

Mumbai, Maharashtra, India

On-site

Company: Mactores Website: Visit Website Business Type: Small/Medium Business Company Type: Service Business Model: B2B Funding Stage: Bootstrapped Industry: Data Analytics Salary Range: ₹ 5-14 Lacs PA Job Description Mactores is a trusted leader among businesses in providing modern data platform solutions. Since 2008, Mactores has been enabling businesses to accelerate their value through automation by providing End-to-End Data Solutions that are automated, agile, and secure. We collaborate with customers to strategize, navigate, and accelerate an ideal path forward with a digital transformation via assessments, migration, or modernization. As an AWS Data Engineer, you are a full-stack data engineer who loves solving business problems. You work with business leads, analysts, and data scientists to understand the business domain and engage with fellow engineers to build data products that empower better decision-making. You are passionate about the data quality of our business metrics and the flexibility of your solution that scales to respond to broader business questions. If you love to solve problems using your skills, then come join Team Mactores. We have a casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself. What you will do Write efficient code in PySpark and AWS Glue Write SQL queries in Amazon Athena and Amazon Redshift Explore new technologies and learn new techniques to solve business problems creatively Collaborate with many teams - engineering and business - to build better data products and services Deliver projects collaboratively along with the team and manage updates to customers on time What are we looking for? 
1 to 3 years of experience in Apache Spark, PySpark, and AWS Glue 2+ years of experience writing ETL jobs using PySpark and Spark SQL 2+ years of experience with SQL queries and stored procedures A deep understanding of the DataFrame API and the transformation functions supported by Spark 2.7+ You Will Be Preferred If You Have Prior experience working on AWS EMR and Apache Airflow Certifications: AWS Certified Big Data – Specialty OR Cloudera Certified Big Data Engineer OR Hortonworks Certified Big Data Engineer Understanding of DataOps Engineering
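For illustration only: the typical transformation chain an ETL job like this performs on a Spark DataFrame (filter, derive a column, aggregate) can be mimicked in plain Python. The data and field names here are made up; in AWS Glue the same steps would run on Spark DataFrames, as noted in the comments.

```python
# Pure-Python sketch of a PySpark-style ETL chain: filter -> withColumn -> groupBy.
from collections import defaultdict

orders = [
    {"order_id": 1, "region": "APAC", "amount": 120.0, "cancelled": False},
    {"order_id": 2, "region": "APAC", "amount": 80.0,  "cancelled": True},
    {"order_id": 3, "region": "EMEA", "amount": 200.0, "cancelled": False},
]

# filter: drop cancelled orders (Spark: df.filter(~col("cancelled")))
valid = [o for o in orders if not o["cancelled"]]

# withColumn: derive a taxed amount (Spark: df.withColumn("taxed", col("amount") * 1.1))
for o in valid:
    o["taxed"] = round(o["amount"] * 1.1, 2)

# groupBy/agg: total taxed amount per region (Spark: df.groupBy("region").agg(sum("taxed")))
totals = defaultdict(float)
for o in valid:
    totals[o["region"]] += o["taxed"]

print(dict(totals))  # {'APAC': 132.0, 'EMEA': 220.0}
```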

Posted 3 days ago

Apply

10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Senior Data Specialist Primary Skills About Brillio: Brillio is one of the fastest-growing digital technology service providers and a partner of choice for many Fortune 1000 companies seeking to turn disruption into a competitive advantage through innovative digital adoption. Brillio, renowned for its world-class professionals, referred to as "Brillians", distinguishes itself through their capacity to seamlessly integrate cutting-edge digital and design thinking skills with an unwavering dedication to client satisfaction. Brillio takes pride in its status as an employer of choice, consistently attracting the most exceptional and talented individuals due to its unwavering emphasis on contemporary, groundbreaking technologies, and exclusive digital projects. Brillio's relentless commitment to providing an exceptional experience to its Brillians and nurturing their full potential consistently garners them the Great Place to Work® certification year after year. Lead and manage a team of ETL developers and ensure high-quality deliverables Collaborate with business and technical stakeholders to gather requirements and translate them into technical specifications Hands-on expertise in Advanced SQL for data extraction, transformation, and validation tasks Strong understanding of Data Warehousing concepts. 
Experience working with Snowflake for building and managing cloud-based data platforms Design, develop, and optimize ETL workflows using Informatica Intelligent Cloud Services (IICS) Ensure data quality, performance tuning, and best practices in ETL processes Provide technical guidance, mentoring, and code reviews for the ETL team Manage task prioritization, timelines, and issue resolution in a fast-paced environment Coordinate deployments and support activities across development, QA, and production environments Python knowledge is a plus Specialization ETL Specialization: Data Specialist Job requirements Role – Data Specialist Years of Experience – 10+ years Strong hands-on experience in Informatica PowerCenter, IICS, and Advanced SQL Strong knowledge of data models and data warehousing Experience with ETL job optimization techniques (mainly pushdown optimization) Experience with ETL error troubleshooting, shell scripting, and Snowflake Expertise in SQL/Advanced SQL queries A good team player/lead and quick learner Python knowledge is an added advantage Tech skills – Informatica PowerCenter, IICS, Snowflake, Advanced SQL, Shell Scripting

Posted 3 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Profile Summary Responsible for leading the development and optimization of data platforms and analytics systems, ensuring alignment with business requirements and best practices, while leading data security, training, and project management efforts for successful implementation and user empowerment. Lead the development and maintenance of data platforms, the data product factory, and analytics systems to support data-driven decision-making. Design and optimize data warehouse architecture to support efficient storage and retrieval of large datasets. Enable self-service data exploration capabilities for users to analyze and visualize data independently. Develop reporting and analysis applications to generate insights from data for business stakeholders. Design and implement data models to organize and structure data for analytical purposes. Implement data security and federation strategies to ensure the confidentiality and integrity of sensitive information. Optimize business intelligence production processes and adopt best practices to enhance efficiency and reliability. Drive and maintain relationships with vendors and oversee project management activities to ensure timely and successful implementation of data platforms and the data product factory Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred. Relevant work experience in data engineering based on the following number of years: Lead I: Five (5) years Lead II: Six (6) years Knowledge, Skills And Abilities Fluency in English Analytical Skills Accuracy & Attention to Detail Numerical Skills Planning & Organizing Skills Presentation Skills Data Modeling and Database Design ETL (Extract, Transform, Load) Skills Programming Skills FedEx was built on a philosophy that puts people first, one we take seriously. 
We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances. Our Company FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company. 
Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970’s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.

Posted 3 days ago

Apply

10.0 - 18.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Building on our past. Ready for the future Worley is a global professional services company of energy, chemicals and resources experts. We partner with customers to deliver projects and create value over the life of their assets. We’re bridging two worlds, moving towards more sustainable energy sources, while helping to provide the energy, chemicals and resources needed now. The Role As a Data Science Lead with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. Conceptualise, build and manage an AI/ML platform (with more focus on unstructured data) by evaluating and selecting best-in-industry AI/ML tools and frameworks Drive and take ownership of developing cognitive solutions for internal stakeholders & external customers. Conduct research in areas such as Explainable AI, Image Segmentation, 3D object detection and Statistical Methods Evaluate not only algorithms & models but also available tools & technologies in the market to optimize organizational spend. Utilize the existing frameworks, standards and patterns to create the architectural foundation and services necessary for AI/ML applications that scale from multi-user to enterprise class. Analyse marketplace trends - economic, social, cultural and technological - to identify opportunities and create value propositions. Offer a global perspective in stakeholder discussions and when shaping solutions/recommendations IT Skills & Experience Thorough understanding of the complete AI/ML project life cycle to establish processes & provide guidance & expert support to the team. 
Expert knowledge of emerging technologies in Deep Learning and Reinforcement Learning Knowledge of MLOps processes for efficient management of AI/ML projects. Must have led project execution with other data scientists/engineers for large and complex data sets Understanding of machine learning algorithms, such as k-NN, GBM, Neural Networks, Naive Bayes, SVM, and Decision Forests. Experience with AI/ML components like JupyterHub, Zeppelin Notebook, Azure ML Studio, Spark MLlib, TensorFlow, Keras, PyTorch and scikit-learn Strong knowledge of deep learning with a special focus on CNN/R-CNN/LSTM/Encoder/Transformer architectures Hands-on experience with large networks like Inception-ResNet and ResNeXt-50. Demonstrated capability using RNNs for text and speech data, and generative models Working knowledge of NoSQL (GraphX/Neo4j), Document, Columnar and In-Memory database models Working knowledge of ETL tools and techniques, such as Talend, SAP BI Platform/SSIS or MapReduce Experience in building KPI/storytelling dashboards on visualization tools like Tableau/Zoomdata People Skills Professional and open communication to all internal and external interfaces. Ability to communicate clearly and concisely and a flexible mindset to handle a quickly changing culture Strong analytical skills. Industry Specific Experience 10-18 years of experience in AI/ML project execution and AI/ML research Education – Qualifications, Accreditation, Training Master's or Doctorate degree in Computer Science Engineering/Information Technology/Artificial Intelligence Moving forward together We’re committed to building a diverse, inclusive and respectful workplace where everyone feels they belong, can bring themselves, and are heard. 
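For illustration only: k-NN, the first algorithm in the list above, reduces to "find the k closest training points and take a majority vote". A minimal from-scratch sketch with toy 2-D data (all names and data points invented for the example):

```python
# From-scratch k-nearest-neighbours classifier on toy 2-D data.
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """train: list of (features, label) pairs; query: a feature tuple."""
    # Sort all training points by Euclidean distance to the query.
    dists = sorted((math.dist(x, query), label) for x, label in train)
    # Majority vote among the k nearest labels.
    top_labels = [label for _, label in dists[:k]]
    return Counter(top_labels).most_common(1)[0][0]

train = [
    ((1.0, 1.0), "A"), ((1.2, 0.8), "A"), ((0.9, 1.1), "A"),
    ((4.0, 4.0), "B"), ((4.2, 3.9), "B"), ((3.8, 4.1), "B"),
]
print(knn_predict(train, (1.1, 1.0)))  # A
print(knn_predict(train, (4.1, 4.0)))  # B
```

In practice a library implementation (e.g. Spark MLlib or scikit-learn, both named above) would be used, with distance indexing and feature scaling handled for you.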
We provide equal employment opportunities to all qualified applicants and employees without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by law. We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there’s a path for you here. And there’s no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change. Company Worley Primary Location IND-MM-Navi Mumbai Other Locations IND-KR-Bangalore, IND-MM-Mumbai, IND-MM-Pune, IND-TN-Chennai, IND-GJ-Vadodara, IND-AP-Hyderabad, IND-WB-Kolkata Job Digital Platforms & Data Science Schedule Full-time Employment Type Employee Job Level Experienced Job Posting Jul 4, 2025 Unposting Date Aug 3, 2025 Reporting Manager Title Head of Data Intelligence

Posted 3 days ago

Apply

5.0 - 7.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

Remote

Job ID: 40322 | Location: Airoli, Maharashtra, India This role primarily involves transforming data into actionable insights through the design and development of interactive dashboards and reports. By leveraging Power BI tools, the analyst enables data-driven decision-making, monitors key performance indicators (KPIs), and supports strategic and operational business needs across the organization. Responsibilities Data Integration & ETL Extract data from diverse sources (databases, APIs, files) Clean and transform data using Power Query Data Modeling Define relationships, hierarchies, measures, and tables Use DAX for advanced calculations and KPIs Report & Dashboard Development Design visually compelling reports with charts, maps, slicers Ensure interactivity and storytelling clarity Performance Optimization Refine data models and queries for speed Monitor refresh schedules and service performance Security & Deployment Implement row-level security and data access controls Deploy reports to the Power BI Service and govern access Stakeholder Collaboration Gather business requirements and iterate on solutions Translate non-technical needs into technical designs Maintenance & Support Troubleshoot issues, update dashboards, fix bugs Train users and maintain documentation Continuous Learning Stay current with new Power BI features and best practices Sometimes develop custom visuals or use Python/R integration Requirements At least a bachelor’s degree in Data Science, Mathematics, or Engineering with 5-7 years of practical experience in advanced analytics. Proficiency in creating interactive Power BI dashboards with large datasets. Experience in Extract, Transform, Load (ETL) processes, including tools such as Amazon Redshift and SAP BW, and languages like Python, R, or SQL. Good communication, project management and analytical skills. Candidates with financial reporting experience are preferred. 
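For illustration only: the "clean and transform" step listed under Data Integration & ETL typically means deduplicating, casting text to numbers, and filling missing values. Power Query does this declaratively; the same logic, with invented field names, can be sketched in plain Python:

```python
# Sketch of a Power-Query-style cleaning pass: drop duplicates,
# cast text to numbers, and fill missing values with a default.
raw = [
    {"kpi": "revenue", "month": "2024-01", "value": "1200"},
    {"kpi": "revenue", "month": "2024-01", "value": "1200"},  # duplicate row
    {"kpi": "revenue", "month": "2024-02", "value": None},    # missing value
]

seen, cleaned = set(), []
for row in raw:
    key = (row["kpi"], row["month"])
    if key in seen:
        continue  # analogous to Power Query's Table.Distinct
    seen.add(key)
    # Cast text to a number; substitute 0.0 when the value is missing.
    value = float(row["value"]) if row["value"] is not None else 0.0
    cleaned.append({**row, "value": value})

print(cleaned)
# [{'kpi': 'revenue', 'month': '2024-01', 'value': 1200.0},
#  {'kpi': 'revenue', 'month': '2024-02', 'value': 0.0}]
```

Once rows are clean and correctly typed, the downstream DAX measures and visuals become far simpler to write and validate.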
Our Offer Company Culture Be part of an amazing team, who will be there to support you. A forward-looking company, with a culture of innovation and a strong portfolio in sustainable technologies. Ongoing Professional Development Opportunities Inclusive Work Environment Approachable Leadership Long term growth opportunity Work-Life Balance Speak Up Culture Women's Inclusion Network of Clariant (WIN) Benefits Hybrid Work Model - 3 days in office and 2 days remote Child Day Care facility fully sponsored by Clariant In-house Cafeteria & Subsidized meals 30 Days Annual Paid Leaves Clariant-Sponsored Annual Health Check-Up Centralized Company Transport for Designated Routes (Regular shift) Employee Wellbeing & Assistance Program Group Medical Insurance, Group Personal Accident Insurance and Life Insurance Maternity & Parental leave policies Performance-Based Competitive Annual Bonus Plan On-Site Medical Assistance for Employees: Doctor Visits Available Three Days a Week with a Medical Attendant Present Five Days a Week in the Medical Room Your Contact Adelaide D'Mello Clariant is a Swiss-based global specialty chemicals company, organized into three business units: Care Chemicals, Catalysts and Adsorbents & Additives. Our purpose as a company is reflected in our tagline "Greater chemistry - between people and planet", which considers the principles of customer, innovation and people orientation, as well as a focus on creating solutions to foster sustainability in different industries by offering high-value and high-performance chemical specialties. At Clariant, we believe that diversity, equity and inclusion are essential to our success. We strive to cultivate a workplace where all employees feel welcomed, respected, supported, and valued. Our diverse workforce allows us to tap into a wealth of perspectives, experiences, and capabilities that drive innovation. 
We are committed to ensuring equal opportunities for professional growth and advancement across all levels of the organization, based on objective criteria and regardless of gender, gender identity, race, ethnicity, religion, protected veteran status, age, disability, sexual orientation or other aspects of diversity in accordance with the relevant governing laws. By bringing together talented individuals with diverse backgrounds and viewpoints, we gain the agility to meet the evolving needs of our global customers and communities. Join our team to help advance our mission of fostering a culture of belonging where everyone can thrive. Learn more about Clariant Follow us on Facebook, Instagram, LinkedIn, X and YouTube Read more about our commitment for people - download our Employment Standards Brochure

Posted 3 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Customers & Products Job Family Group: Business Support Group Job Description: As bp transitions to a coordinated energy company, we must adapt to a changing world and maintain driven performance. bp’s customers & products (C&P) business area is setting up a business and technology centre (BTC) in Pune, India. This will support the delivery of an enhanced customer experience and drive innovation by building global capabilities at scale, using technology, and developing deep expertise. The BTC will be a core and connected part of our business, bringing together colleagues who report into their respective part of C&P, working together with other functions across bp. This is an exciting time to join bp and the customers & products BTC! Job Title: Process Specialist Data Senior SME About the role: As the Process Specialist Data for Castrol you will lead the design, governance, and sustainability of the Castrol Data Ecosystem across all major ERPs, source systems, and digital platforms. The role ensures strategic alignment with the Digital Business Strategy and drives transformation through agile methodologies. The Process Specialist Data acts as a domain expert, product owner, or scrum master depending on the scope and scale of initiatives! Key Accountabilities: Data Ecosystem Design & Lifecycle Management: Lead the design and continuous improvement of the Castrol Data Ecosystem, ensuring it is sustainable, scalable and aligned with the Data Management Framework, Data Standards and minimum design principles. Governance of the Data Management Framework: Supervise the repository covering data quality, pipelining, governance, modelling, compliance, and security across all systems and platforms. Strategic Data Challenge Resolution: Address data challenges across digital, MI, and analytics domains in collaboration with C&P, Technology, GBS, and Castrol’s PUs, HUBs, Functions, and Markets. 
Data Integration Leadership: Act as an integrator for internal and third-party data sources, ensuring alignment with the Castrol Data Fabric standards and principles and future-proofing digital capabilities like data augmentation, predictive analytics, decision intelligence and AI. Collaborate with peers and support multi-functional teams. Work across time zones and lead multi-disciplinary initiatives. Approach: Apply a solution-oriented attitude to scale from global to local, and communicate fluently. Recommend data architecture strategies, continuous improvement opportunities, and capability/toolkit enhancements to the Digital Operational Excellence Manager and business collaborators. Experience and Qualifications: Education: Degree in an analytical field (preferably engineering) Experience: 10+ years of relevant experience in delivering data strategies and ETL transformations within major ERP and business transformation programs. Deep expertise in data modelling, lineage, normalisation, harmonisation, data pipelines and process design. Strong ability to translate data into actionable insights using queries, models, and Power BI. Confident communicator with the ability to craft compelling data narratives. Skills & Proficiencies: Strategic problem solver with leadership capabilities. Expertise in ERP systems (SAP/R3, SAP/S4, JDE). Skilled in ERP data layer navigation and lineage assessment. Proficient in Power BI and data visualisation. Capable of working across multiple levels of detail: data lineage, normalisation, quality, security, process design, and systems architecture. Strong influencing and leadership skills, with the ability to flex style and zoom in/out when leading junior and senior collaborators with different levels of expertise. Demonstrated success in multi-functional deployments and performance optimisation. Proven leadership skills and a track record of successful deployment across multiple areas, with a focus on input and output success criteria measures. 
BP Behaviours: Respect – Build strong, trust-based relationships through honest dialogue. Excellence – Apply standard methodologies, act professionally, and strive for executional excellence. One Team – Collaborate effectively and support team success. You will work with: You will be a part of a 20-member Global Data & Analytics Team. You will operate peer-to-peer in a team of seasoned global experts on Process, Data, Advanced Analytics and Data Science. The Global Data & Analytics team reports into the Castrol Digital Enablement team that is managing the digital estate for Castrol, where we enhance scalability, process and data integration! Travel Requirement Negligible travel should be expected with this role Relocation Assistance: This role is eligible for relocation within country Remote Type: This position is not available for remote working Skills: Agility core practices, Agility tools, Business Operations, Business process architecture, Business process control, Business process improvement, Commercial Acumen, Communication, Data Management, Data visualization and interpretation, Decision Making, Demand Management, Design Thinking, Goal Setting, Influencing, Lean Practices, Managing change, Managing Performance, Project and programme management, Stakeholder Engagement, Stakeholder Management, Strategic Thinking, Workload Prioritization Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). 
If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 3 days ago

Apply

20.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description Job Title: Senior QA Engineer Location: Bangalore Position Type: Full-time Position Level: 3 Who We Are Xactly is a leader in Sales Performance Management Solutions and a part of Vista Equity Partners portfolio companies since 2017. The Xactly Intelligent Revenue Platform helps businesses improve go-to-market outcomes through increased collaboration, greater efficiencies, and connecting data from all critical functions of the revenue lifecycle on a single platform. Born in the cloud almost 20 years ago, Xactly provides customers with extensive experience in solving the most challenging problems customers of all sizes face, backed by almost 20 years of proprietary data and award-winning AI. Named among the best workplaces in the U.S. by Great Place to Work six times, honored on FORTUNE Magazine’s inaugural list of the 100 Best Workplaces for Millennials, and chosen as the “Market Leader in Incentive Compensation” by CRM magazine. We’re building a culture of success and are looking for motivated professionals to join us! The Team Xactly’s QE team is a rapidly growing & very well diversified team with a very strong focus on cutting-edge test automation tools & technologies. We are a very strong team of 35+ members spread across our engineering centers in San Jose, Denver and Bangalore (India). All engineers in the QE team are encouraged to operate independently and with the highest levels of accountability. Each QE engineer works with a tight-knit team of back-end developers, front-end developers, and Product Managers in the scrum teams, with laser focus on producing high-quality code & products for our customers. All QE engineers are trained well on all aspects, i.e., product training, automation tools & infrastructure, CI/CD, etc., ensuring their success in scrum teams. 
Xactly QE team members work with cutting-edge tools & technologies like Selenium WebDriver, Java, TestNG, Maven, REST Assured, Jenkins, Docker, Kubernetes, Harness, Snowflake, Terraform and JMeter, to name a few. The Opportunity As a Senior QA Engineer at Xactly Corporation, you will maintain and continuously improve upon the QE function and facilitate implementation of QE best practices within the organization. Establish partnerships with internal stakeholders to understand customer requirements and ensure quality of delivery. Own, drive, measure and optimize the overall quality of the development and delivery process. Drive quality automation and take the customer perspective for end-to-end quality. At Xactly, we believe everyone has a unique story to tell, and these small differences between us have a big impact. When bright, diverse minds come together, we’re challenged to think in different ways, generate creative ideas, be more innovative, and take on new perspectives. Our customers come from different cultures and walks of life all around the world, and we believe our teams should reflect that to build strong and lasting relationships. THE SKILL SETS: Experience of 5-8 years with strong automation testing skills. Strong testing skills with the ability to develop test strategy, design test plans, and write test cases effectively and independently. Strong experience in GUI automation (such as Selenium) and API automation (such as JUnit) using off-the-shelf tools Experience in testing enterprise J2EE business applications. Strong SQL query knowledge in PostgreSQL or Oracle databases. Experience with the Mabl testing tool is a plus. 
- Strong experience as a QA engineer in Scrum methodology, with automated tests required as part of the definition of done.
- Programming experience in a language such as Java.
- Experience in product-based companies.

Nice-to-have Skills

- Experience working on a team in a SAFe Portfolio and ART.
- Exposure to ETL/analytics modules.
- Exposure to build and deployment tools such as Jenkins, Harness, and Maven.

Within One Month, You’ll

- Attend new hire training
- Learn the Dev and QE processes
- Participate in the scrum development process
- Get to know your team

Within Three Months, You’ll

- Learn Xactly’s SaaS technology stack
- Gain complete domain and Xactly product knowledge
- Take ownership of a module/project
- Perform code reviews

Within Six Months, You’ll

- Ensure best QE practices are being used
- Work on multiple functionalities and take ownership of the respective module’s automation
- Perform RCAs on production escapes and ensure corrective actions are implemented

Within Twelve Months, You’ll

- Help other engineers grow technically by pairing and developing other learning opportunities
- Train new joiners and peers in automation
- Continuously work on QE process improvements to maximize team effectiveness and efficiency

Benefits & Perks

- Paid Time Off (PTO)
- Comprehensive health and accidental insurance coverage
- Tuition reimbursement
- XactlyFit gym/fitness program reimbursement
- Free snacks onsite (if you work in the office)
- Generous employee referral program
- Free parking and subsidized bus pass (a go-green initiative!)
- Wellness program

OUR VISION: Unleashing human potential to maximize company performance. We address a critical business need: to incentivize employees and align their behaviors with company goals.

OUR CORE VALUES: Customer Focus | Accountability | Respect | Excellence (CARE) are the keys to our success, and each day we’re committed to upholding them by delivering the best we can to our customers.
Xactly is proud to be an Equal Opportunity Employer. Xactly provides equal employment opportunities to all employees and applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, pregnancy, sexual orientation, or any other characteristic protected by law. This means we believe in celebrating diversity and creating an inclusive workplace environment, where everyone feels valued, heard, and has a sense of belonging. By doing this, everyone in the Xactly family has the power to make a difference and unleash their full potential.

We do not accept resumes from agencies, headhunters, or other suppliers who have not signed a formal agreement with us.

Posted 3 days ago
