2.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
As an LLM Engineer at HuggingFace, you will play a crucial role in bridging the gap between advanced language models and real-world applications. Your primary focus will be on fine-tuning, evaluating, and deploying LLMs using frameworks such as HuggingFace and Ollama. You will be responsible for developing React-based applications with seamless LLM integrations through REST, WebSockets, and APIs. Additionally, you will work on building scalable pipelines for data extraction, cleaning, and transformation, as well as creating and managing ETL workflows for training data and RAG pipelines. Your role will also involve driving full-stack LLM feature development from prototype to production.

To excel in this position, you should have at least 2 years of professional experience in ML engineering, AI tooling, or full-stack development. Strong hands-on experience with HuggingFace Transformers and LLM fine-tuning is essential. Proficiency in React, TypeScript/JavaScript, and back-end integration is required, along with comfort working with data engineering tools such as Python, SQL, and Pandas. Familiarity with vector databases, embeddings, and LLM orchestration frameworks is a plus, and experience with Ollama, LangChain, or LlamaIndex is a bonus. Exposure to real-time LLM applications such as chatbots, copilots, or internal assistants, as well as prior work with enterprise or SaaS AI integrations, is highly valued.

This role offers a remote-friendly environment with flexible working hours and a high-ownership opportunity. Join our small, fast-moving team at HuggingFace and be part of building the next generation of intelligent systems. If you are passionate about working on impactful AI products and have the drive to grow in this field, we would love to hear from you.
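For illustration only, a minimal sketch of the kind of HuggingFace Transformers fine-tuning workflow this role describes; the model, dataset, and hyperparameters below are placeholder assumptions, not details taken from the posting:

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Public demo dataset and base model chosen purely for illustration.
dataset = load_dataset("imdb")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    # Truncate/pad reviews to a fixed length so batches stack cleanly.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=256)

tokenized = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

args = TrainingArguments(
    output_dir="finetune-out",
    per_device_train_batch_size=8,
    num_train_epochs=1,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    # Small subsets keep the sketch quick to run end to end.
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=tokenized["test"].select(range(500)),
)

trainer.train()
print(trainer.evaluate())
```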
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
You will be responsible for contributing to the development and continuous enhancement of a proprietary low-code or no-code platform for application development, ETL workflows, and data analytics. This will involve designing and implementing new features and functionalities using a microservices-based architecture with the MEAN stack, Python, and Elasticsearch. You will also be tasked with maintaining and optimizing existing components to ensure performance, scalability, and reliability. Collaborating with cross-functional teams to deliver robust and user-friendly solutions will be a key part of your role. Additionally, you will support and troubleshoot issues across the platform to ensure smooth operation and empower end-users by enabling application creation with minimal coding effort. Ensuring code quality through best practices, code reviews, and testing will also be part of your responsibilities. Furthermore, you will be expected to research and integrate new technologies to improve platform capabilities and performance. ViewZen Labs Private Limited is a DIPP-recognized start-up providing data collection and data analytics solutions. The company's platforms are designed to help stakeholders collect data quickly, visualize it, and benefit from it at very low costs. By letting clients focus on their core business while managing their data and providing actionable insights, the company aims to offer valuable solutions built on core technology platforms that combine deep industry research and domain expertise.,
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
We are looking for a Lead Database Developer-Oracle to join our technology team at Clarivate. As the Lead Database Developer, you will oversee the design, development, and maintenance of high-performance databases utilizing Oracle and PostgreSQL. You should have a Bachelor's degree in computer science, Information Technology, or a related field, or equivalent experience. A minimum of 8 years of experience in Oracle database environments and PostgreSQL is required, along with expertise in database performance tuning, query optimization, and capacity planning. Additionally, you should have at least 4 years of experience in cloud-based database services like AWS RDS. A solid understanding of data security, backup, and recovery procedures is necessary, along with knowledge of relational database concepts such as primary/foreign keys, many-to-many relationships, and complex join operations. Experience in system analysis, design, problem-solving, support, and troubleshooting is also expected. Familiarity with cloud database platforms such as AWS RDS and Azure Cloud would be advantageous. It would be beneficial if you have an in-depth understanding of database architecture and data modeling principles, as well as good knowledge of No-SQL database solutions, AWS, and Azure Db solutions and services. In this role, you will collaborate with the development team to design and implement efficient database structures that meet the organization's requirements. You will develop and maintain database schemas, tables, views, stored procedures, and functions. Monitoring and analyzing database performance to identify and resolve bottlenecks or performance issues will be a key responsibility. You will optimize queries, indexes, data schemas, and database configurations to enhance system performance and ensure data security and integrity by implementing and maintaining database security measures. Additionally, you will develop and maintain data integration processes, including Extract, Transform, and Load (ETL) workflows, and create comprehensive documentation for database environments. Working with DevOps, you will develop and maintain database backup and recovery strategies to ensure data integrity and availability. You will be part of a Database performance team that operates horizontally across the Intellectual Property pillar at Clarivate. The team works with various database genres, both in the cloud and on-premise, and encourages the support and deep knowledge of other specialist databases. Collaboration with cross-functional teams, including developers, system administrators, and business stakeholders to understand their database requirements and provide technical support is essential. This is a full-time opportunity with Clarivate, requiring 9 hours of work per day, including a lunch break. Clarivate is committed to providing equal employment opportunities for all qualified individuals with respect to hiring, compensation, promotion, training, and other terms, conditions, and privileges of employment, ensuring compliance with applicable laws and regulations governing non-discrimination in all locations.,
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Haryana
On-site
As an experienced SQL Developer with a minimum of 3 years of relevant experience, you will be responsible for working with data storage and management. Your role will involve proficiency in MS SQL Server language, platform, and environment, including SSMS. You should have a Bachelor's or Master's degree in Computer Science, Information Systems, or a related discipline. Your expertise should include working with SQL Tables, data types, stored procedures, Views, Functions, and T-SQL. You should be skilled in Query Performance and Optimization, as well as have the ability to understand PL/SQL and develop/troubleshoot business logic migration into T-SQL. An understanding of relational data models and tools will be advantageous. Experience in developing ETL workflows, process improvement, and process automation related to database development will be a valuable addition to your skillset. You should also be able to collaborate effectively with clients, Database, Analyst, and Operations teams. If you are looking for a challenging role in SQL development, this opportunity offers a competitive salary package that is best in the industry.,
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve.

Benefits include comprehensive health and life insurance and well-being benefits (based on location), pension/retirement benefits, and Paid Time Off, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The Impact You Will Have In This Role
Enterprise Services comprises multiple business platforms, including Client Services, Global Business Operations, Business Architecture, Data Strategy and Analytics, and Digital Services, which report into the Chief of Enterprise Services. These grouped platforms enable the business to optimize delivery for clients, generate efficiencies and resilience, and enable consistency in the business digitization strategy, processes, and end-to-end best practices.

The skilled Automation Tester is experienced in testing applications developed in Appian, able to validate ETL workflows by querying and comparing result sets, and has hands-on knowledge of testing applications developed using RPA tools like BluePrism. The Automation Tester is a self-starter with a strong ability to prioritize, own testing deliverables/timelines, understand various solution components, and clearly and effectively communicate results with the team.

What You'll Do
- Develop and execute test cases for applications developed in Appian, ensuring comprehensive coverage of both positive and negative scenarios.
- Test workflows designed on Talend, focusing on data extraction, transformation, and loading processes.
- Validate and verify automation (RPA) solutions developed using BluePrism, ensuring they meet business requirements and function as expected.
- Gather and set up required test data for testing, ensuring data integrity and consistency.
- Track test results and defects throughout the testing lifecycle, using tools like JIRA for defect management.
- Coordinate with the user base for a successful roll-out during the user acceptance test phase, providing clear and concise feedback.
- Independently manage multiple projects based on provided priorities to complete testing and provide feedback within given timelines.
- Collaborate with other team members and analysts through the delivery cycle, ensuring seamless integration and communication.
- Participate in an Agile delivery team that builds high-quality and scalable work products, contributing to sprint planning, reviews, and retrospectives.
- Assist in the evaluation of upcoming technologies and contribute to the overall solution design, providing insights and recommendations.
- Support production releases and maintenance windows, working closely with the Operations team to ensure smooth deployments.
- Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately.

Qualifications
Bachelor's degree preferred or equivalent experience.
Talents Needed For Success
- Minimum of 6 years of related experience in testing automation solutions.
- Ability to create scripts using Python.
- Hands-on experience with test automation tools like Selenium, TestComplete, and UFT One.
- Experience in using tools like BluePrism, UiPath, and Power Automate.
- Strong understanding of SDLC and legacy technologies like MS Access and mainframe systems.
- Ability to write and execute SQL queries to validate test results in SQL Server databases.
- Experience in testing solutions built on Appian, with a focus on process automation and workflow management.
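As an illustrative aside (not part of the posting), a minimal sketch of the result-set comparison described above: querying a source and a target table and flagging rows that differ. An in-memory SQLite database and made-up table names stand in for the real SQL Server environment, which would typically be reached via pyodbc or SQLAlchemy:

```python
import sqlite3
import pandas as pd

# In-memory SQLite stands in for the source and target databases in this sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE source_orders (order_id INTEGER, amount REAL);
CREATE TABLE target_orders (order_id INTEGER, amount REAL);
INSERT INTO source_orders VALUES (1, 100.0), (2, 250.5), (3, 75.0);
INSERT INTO target_orders VALUES (1, 100.0), (2, 250.5);
""")

source = pd.read_sql("SELECT order_id, amount FROM source_orders ORDER BY order_id", conn)
target = pd.read_sql("SELECT order_id, amount FROM target_orders ORDER BY order_id", conn)

# Row-count check plus a full outer merge to surface rows missing on either side.
print(f"Source rows: {len(source)}, target rows: {len(target)}")
diff = source.merge(target, on=["order_id", "amount"], how="outer", indicator=True)
mismatches = diff[diff["_merge"] != "both"]
print(mismatches if not mismatches.empty else "Result sets match.")
```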
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Thane, Maharashtra
On-site
Job Description
As a Python Backend Engineer with exposure to AI engineering at Quantanite, you will be an integral part of our team responsible for building a scalable, cognitive data platform. Your role will involve designing and developing high-performance backend services using Python (FastAPI), developing RESTful APIs for data ingestion, transformation, and AI-based feature access, and collaborating closely with DevOps and data engineering teams to integrate backend services with Azure data pipelines and databases.

Your primary responsibilities will include managing database schemas, writing complex SQL queries, and supporting ETL processes using Python-based tools. Additionally, you will be tasked with building secure, scalable, and production-ready services that adhere to best practices in logging, authentication, and observability. You will also implement background tasks and async event-driven workflows for data crawling and processing.

In terms of AI engineering contributions, you will support the integration of AI models (NLP, summarization, information retrieval) within backend APIs. You will collaborate with the AI team to deploy lightweight inference pipelines using PyTorch, TensorFlow, or ONNX, and participate in training data pipeline design and minor model fine-tuning as needed for business logic. Furthermore, you will contribute to the testing, logging, and monitoring of AI agent behavior in production environments.

To be successful in this role, you should have at least 3 years of experience in Python backend development, with strong proficiency in FastAPI or equivalent frameworks. A solid understanding of RESTful API design, asynchronous programming, and web application architecture is essential. Additionally, you should demonstrate proficiency in working with relational databases (e.g., PostgreSQL, MS SQL Server) and Azure cloud services, as well as experience with ETL workflows, job scheduling, and data pipeline orchestration (Airflow, Prefect, etc.). Exposure to machine learning libraries (e.g., Scikit-learn, Transformers, OpenAI APIs) is a plus, along with familiarity with containerization (Docker), CI/CD practices, and performance tuning. A mindset of code quality, scalability, documentation, and collaboration is highly valued at Quantanite.

If you are looking for a challenging yet rewarding opportunity to work in a collaborative environment with a focus on innovation and growth, we encourage you to apply to join our dynamic team at Quantanite.
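Purely as an illustration of the FastAPI-style service work described above (the endpoint names, model, and in-memory store are hypothetical and not taken from the posting):

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Ingestion service sketch")

class Record(BaseModel):
    source: str
    payload: dict

# In-memory store used only to keep the example self-contained;
# a real service would persist to PostgreSQL or another database.
STORE: list[Record] = []

@app.post("/ingest")
async def ingest(record: Record) -> dict:
    # A production endpoint might enqueue a background task or trigger
    # an async ETL step here instead of appending in memory.
    STORE.append(record)
    return {"status": "accepted", "count": len(STORE)}

@app.get("/records")
async def list_records() -> list[Record]:
    return STORE
```

Such a sketch could be run locally with, for example, `uvicorn app:app --reload`, then exercised by POSTing JSON to /ingest.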
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
This is a full-time Data Engineer position with D Square Consulting Services Pvt Ltd, based in Pan-India with a hybrid work model. You should have at least 5 years of experience and be able to join immediately. As a Data Engineer, you will be responsible for designing, building, and scaling data pipelines and backend services supporting analytics and business intelligence platforms. A strong technical foundation, Python expertise, API development experience, and familiarity with containerized CI/CD-driven workflows are essential for this role. Your key responsibilities will include designing, implementing, and optimizing data pipelines and ETL workflows using Python tools, building RESTful and/or GraphQL APIs, collaborating with cross-functional teams, containerizing data services with Docker, managing deployments with Kubernetes, developing CI/CD pipelines using GitHub Actions, ensuring code quality, and optimizing data access and transformation. The required skills and qualifications for this role include a Bachelor's or Master's degree in Computer Science or a related field, 5+ years of hands-on experience in data engineering or backend development, expert-level Python skills, experience with building APIs using frameworks like FastAPI, Graphene, or Strawberry, proficiency in Docker, Kubernetes, SQL, and data modeling, good communication skills, familiarity with data orchestration tools, experience with streaming data platforms like Kafka or Spark, knowledge of data governance, security, and observability best practices, and exposure to cloud platforms like AWS, GCP, or Azure. If you are proactive, self-driven, and possess the required technical skills, then this Data Engineer position is an exciting opportunity for you to contribute to the development of cutting-edge data solutions at D Square Consulting Services Pvt Ltd.,
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
You are a highly skilled and experienced .NET Full Stack Developer with a minimum of 6 years of hands-on expertise in both backend and frontend development. You have robust experience with .NET/.NET Core, Microservices, ReactJS/NodeJS, and strong DevOps capabilities. This is a hybrid role across multiple Indian cities with a contract duration of 6 to 12 months. As a .NET Full Stack Developer, your key responsibilities include leading the design, development, testing, and deployment of scalable web applications using Microsoft technologies. You will collaborate in an Agile team environment to deliver end-to-end full stack solutions and build and maintain Microservices-based architectures. Developing clean, scalable code using .NET 8 or .NET Core and C#, implementing frontend solutions using ReactJS, Redux, Material UI, and Bootstrap (or NodeJS if applicable), and designing and managing robust CI/CD pipelines using Jenkins, GitHub/Bitbucket, and containerization technologies are essential parts of your role. You will utilize Azure or AWS services like Web Apps, AKS, Docker, Redis, Service Bus, App Insights, and more. Ensuring best practices in software engineering, participating in database design and performance tuning, and handling multi-shore delivery environments are also key responsibilities. To qualify for this position, you must hold a B.Tech / M.Tech in Computer Science or a related field and have at least 6 years of full stack expertise in both backend and frontend technologies. Strong working knowledge of Azure or AWS, Docker, Containers, AKS, Jenkins, Git, CI/CD pipelines, experience in relational databases and data design, and a deep understanding of Agile Scrum, Kanban, and Waterfall methodologies are required. Other skills such as LINQ, Entity Framework, RESTful APIs, Redux, CSS/SCSS, ETL workflows, React Native (preferred), and knowledge in Supply Chain systems (preferred) are desirable. Excellent problem-solving, communication, and documentation skills are essential. The ideal candidate for this role is technically sound, highly process-driven, possesses strong troubleshooting and analytical skills, is capable of independently owning technical solutions and leading implementations, and is ethical, principled, and effective in multi-cultural teams. A strong understanding of SDLC processes and Agile delivery is expected. This high-impact engineering role involves full lifecycle software development, close collaboration with distributed teams, and frequent engagement with DevOps practices and cloud services. You are expected to transform abstract business concepts into scalable technical solutions.,
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
The Smart Cube, a WNS company, is seeking Assistant Managers who will collaborate with the Project Lead to design effective analytical frameworks aligned with client objectives. The Assistant Managers will translate requirements into clear deliverables, manage data preparation, perform quality checks, and ensure analysis readiness. They should possess expertise in implementing analytical techniques and machine learning methods such as regression, decision trees, segmentation, forecasting, and algorithms like Random Forest, SVM, and ANN. Additionally, they are responsible for sanity checks, quality control, and interpreting results in a business context to identify actionable insights. Assistant Managers will independently handle client communications, interact with onsite leads, and manage the entire project lifecycle from initiation to delivery. This includes translating business requirements into technical specifications, overseeing data teams, ensuring data integrity, and facilitating communication between business and technical stakeholders. They will lead process improvements in analytics and act as project leads for cross-functional coordination. In terms of client management, Assistant Managers will serve as client leads, maintain strong relationships, participate in deliverable discussions, and guide project teams on execution strategies. Proficiency in connecting databases with Knime, understanding SQL concepts, and designing Knime ETL workflows to support BI tools is required. They must also be proficient in PowerBI for building dashboards and supporting data-driven decision-making. Knowledge of leading analytics projects using PowerBI, Python, and SQL to generate insights is essential. Ideal candidates should have 4-7 years of experience in advanced analytics across Marketing, CRM, or Pricing in Retail or CPG. Experience in other B2C domains is also acceptable. Proficiency in handling large datasets using Python, R, or SAS, and experience with multiple analytics or machine learning techniques is required. Candidates should have a good understanding of consumer sectors such as Retail, CPG, or Telecom, and experience with various data formats and platforms including flat files, RDBMS, Knime workflows and server, SQL Server, Teradata, Hadoop, and Spark. Strong written and verbal communication skills are essential for creating client-ready deliverables using Excel and PowerPoint. Basic knowledge of statistical and machine learning techniques like regression, clustering, decision trees, forecasting, and other ML models is also necessary. Knowledge of optimization methods, supply chain concepts, VBA, Excel Macros, Tableau, and Qlikview will be an added advantage. Qualifications: - Engineers from top tier institutes (IITs, DCE/NSIT, NITs) or Post Graduates in Maths/Statistics/OR from top Tier Colleges/Universities - MBA from top tier B-schools,
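For illustration only, a minimal sketch of the kind of machine-learning step mentioned above, fitting a Random Forest classifier and sanity-checking its results before any business interpretation. The synthetic data and every parameter here are assumptions, not details from the role description:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic data stands in for a real CRM/pricing dataset.
X, y = make_classification(n_samples=1000, n_features=12, n_informative=6, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# Fit a Random Forest and hold out a test set for a basic quality check.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Sanity check: inspect precision/recall before drawing business conclusions.
print(classification_report(y_test, model.predict(X_test)))
```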
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Data Engineer at our company, you will be an integral part of the team responsible for providing critical monitoring and support for our production data environment. You will work closely with internal and external teams to ensure operational stability, triage incidents, and maintain consistent coverage in a 7-day-a-week support role. Your responsibilities will include proactively monitoring and ensuring the successful execution of scheduled ETL jobs, troubleshooting and resolving issues in data pipelines and SQL environment, and coordinating with IT teams and vendors to address infrastructure-related issues. You will follow and maintain Run Books and SOPs, support data engineering tasks, and document detailed incident reports to ensure system uptime and minimize downtime. To excel in this role, you should have 3-5+ years of experience in MS SQL Server administration and development, strong skills in SSIS and T-SQL, and proven experience in supporting ETL workflows and handling production incidents. Familiarity with SQL Agent job monitoring, excellent communication and collaboration skills, and the ability to follow operational processes and escalation protocols are essential. This is a full-time position with benefits including a flexible schedule, health insurance, life insurance, paid time off, and the opportunity to work from home. The shift timing is from 9:30 AM to 5:30 PM IST, 7 days a week, with coverage provided by two developers working in rotation. If you are passionate about data engineering and have the required skills and experience, we encourage you to apply for this position. Required Skills: - 3-5+ years of experience in MS SQL Server administration and development - Strong proficiency with SSIS and T-SQL - Proven experience supporting ETL workflows and handling production incidents - Familiarity with SQL Agent job monitoring and logging - Excellent communication and collaboration skills - Ability to follow structured operational processes and escalation protocols Education: Bachelor's degree required Experience: 5 years of relevant work preferred Join us and be part of a dynamic team dedicated to maintaining the operational stability of our production data environment.,
Posted 1 month ago
0.0 - 4.0 years
0 Lacs
Hyderabad, Telangana
On-site
As an Intern at our company, you will be responsible for developing, testing, and maintaining data pipelines using PySpark and Databricks. Your role will involve assisting in the construction of ETL workflows and integrating data from various sources through Azure Synapse Analytics. You will ensure data quality, integrity, and consistency across all systems. Additionally, you will have the opportunity to contribute to documentation and participate in the performance tuning of data solutions.

Our company, FEG, is one of the largest omnichannel betting and gaming operators in Central and Eastern Europe. As a digital-first business, we rely on technology in how we engage with our customers, execute marketing campaigns, and oversee internal operations. FEG India serves as our emerging technology hub, delivering top-notch solutions that bolster FEG's global operations. With a specialized team skilled in data management, business intelligence, analytics, AI/ML, software development, testing, and IT services, FEG India is pivotal in propelling innovation and excellence throughout the organization.
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
Haryana
On-site
As a member of our Non-Financial Risk team at Macquarie, you will be involved in embedding the Operational Risk Management Framework across the organization, encompassing financial regulatory reporting and financial statutory reporting risk. We are seeking individuals who have a keen interest in analytics and reporting within the realm of risk management.

At Macquarie, we take pride in our ability to bring together a diverse group of individuals and empower them to explore a multitude of possibilities. Operating in 31 markets globally with 56 years of continuous profitability, we offer a supportive and collaborative environment where every team member, regardless of their role, is encouraged to contribute ideas and drive outcomes.

Your primary responsibilities in this role will involve collaborating with regional and central teams to establish the Leadership Committee risk profile and providing insightful reporting on risk profiles utilizing data analytics. Additionally, you will be tasked with supporting the automation of existing reports and identifying opportunities for enhanced reporting through visualization tools and dashboard creation.

To excel in this role, we are looking for individuals who possess expertise in data models, data warehousing, and segmentation techniques. Strong analytical skills with a keen attention to detail and accuracy are essential, along with proficiency in Business Intelligence tools such as Tableau and Power BI. Advanced experience in Excel, VBA, and SQL, including Impala, Starburst Presto, and Hue, is highly desirable. Furthermore, the ability to design, develop, validate, and troubleshoot ETL workflows in Alteryx, with a minimum of 2 years of experience, is preferred.

If you are inspired to contribute to building a better future with us and are enthusiastic about the role or the opportunity to work at Macquarie, we encourage you to apply and share your unique perspective with us.

Financial Management, People and Engagement (FPE) serves as a consolidated interface for Macquarie's businesses across key areas of people, strategy, communications, and financial management. Comprising two pillars, Financial Management and People and Engagement, FPE is responsible for overseeing the Group's financial, tax, and treasury activities, as well as strategic priorities. Additionally, it plays a crucial role in fostering our culture through people and community engagement strategies, while engaging with stakeholders to safeguard and enhance Macquarie's global reputation.

At Macquarie, we are committed to promoting diversity, equity, and inclusion. We strive to provide reasonable accommodations to individuals who may require support during the recruitment process and in their working arrangements. If you need additional assistance, please do not hesitate to inform us during the application process.
Posted 2 months ago
4.0 - 5.0 years
5 - 8 Lacs
Bengaluru, Karnataka, India
On-site
Key Responsibilities:
- Design, develop, and maintain ETL workflows and pipelines using Python
- Extract data from various sources (databases, APIs, flat files) and perform data transformations to meet business requirements
- Load processed data into target systems such as data warehouses, data lakes, or databases
- Optimize ETL processes for performance, scalability, and reliability
- Collaborate with data architects and analysts to understand data requirements and design solutions
- Implement data validation and error-handling mechanisms to ensure data quality
- Automate routine ETL tasks and monitoring using scripting and workflow tools
- Document ETL processes, data mappings, and technical specifications
- Troubleshoot and resolve issues in ETL workflows promptly
- Follow data governance, security policies, and compliance standards

Required Skills:
- 4 to 5 years of hands-on experience in Python programming for ETL development
- Strong knowledge of ETL concepts and data integration best practices
- Experience with ETL frameworks/libraries such as Airflow, Luigi, Apache NiFi, Pandas, or similar
- Proficiency in SQL and working with relational databases (Oracle, MySQL, SQL Server, etc.)
- Familiarity with data formats like JSON, XML, CSV, Parquet
- Experience with cloud platforms and tools such as AWS Glue, Azure Data Factory, or GCP Dataflow is a plus
- Understanding of data warehousing concepts and architectures (star schema, snowflake schema)
- Experience with version control tools such as Git
- Knowledge of containerization (Docker) and CI/CD pipelines is desirable

Preferred Qualifications:
- Experience working with big data technologies such as Hadoop, Spark, or Kafka
- Familiarity with NoSQL databases (MongoDB, Cassandra)
- Experience with data visualization and reporting tools
- Certification in Python or Data Engineering tools
- Knowledge of Agile methodologies and working in collaborative teams

Soft Skills:
- Strong analytical and problem-solving skills
- Excellent communication and collaboration abilities
- Detail-oriented and committed to delivering high-quality work
- Ability to manage multiple tasks and meet deadlines
- Proactive and eager to learn new technologies and tools
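For illustration only, a minimal sketch of such an extract-transform-load step in Python with Pandas. The file names and columns are hypothetical, and in an Airflow deployment each function would typically become its own task:

```python
import pandas as pd
from pathlib import Path

# Hypothetical file names used purely for illustration.
RAW_CSV = Path("raw_sales.csv")
OUTPUT_PARQUET = Path("clean_sales.parquet")

def extract(path: Path) -> pd.DataFrame:
    """Read the raw flat file; in practice this could be a database or API call."""
    return pd.read_csv(path, parse_dates=["order_date"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Basic cleaning plus a derived column, with simple validation rules."""
    df = df.dropna(subset=["order_id", "amount"])
    df = df[df["amount"] >= 0]  # reject negative amounts
    df["order_month"] = df["order_date"].dt.to_period("M").astype(str)
    return df

def load(df: pd.DataFrame, path: Path) -> None:
    """Write to a columnar file (requires pyarrow); a warehouse load would replace this step."""
    df.to_parquet(path, index=False)

if __name__ == "__main__":
    load(transform(extract(RAW_CSV)), OUTPUT_PARQUET)
```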
Posted 2 months ago
1.0 - 5.0 years
0 Lacs
Karnataka
On-site
The company Loyalytics is a rapidly growing Analytics consulting and product organization headquartered in Bangalore. They specialize in assisting large retail clients worldwide to capitalize on their data assets through consulting projects and product accelerators. With a team of over 100 analytics practitioners, Loyalytics is at the forefront of utilizing cutting-edge tools and technologies in the industry. The technical team at Loyalytics comprises data scientists, data engineers, and business analysts who handle over 1 million data points daily. The company operates in a massive multi-billion dollar global market opportunity and boasts a leadership team with a combined experience of over 40 years. Loyalytics has gained a strong reputation in the market, with word-of-mouth and referral-driven marketing strategies that have attracted prestigious retail brands in the GCC regions like Lulu and GMG. One of the key distinguishing factors of Loyalytics is its 10-year history as a bootstrapped company that continues to expand its workforce, currently employing over 100 individuals. They are now seeking a passionate and detail-oriented BI Consultant Tableau with 1-2 years of experience to join their analytics team. The ideal candidate for this role should have a solid foundation in SQL and hands-on expertise in developing dashboards using Tableau. Responsibilities include designing, developing, and maintaining interactive dashboards and reports, writing efficient SQL queries, collaborating with cross-functional teams, ensuring data accuracy, and optimizing dashboard performance. Strong analytical and problem-solving skills, along with good communication and documentation abilities, are essential for success in this position. Required skills and qualifications for the BI Consultant Tableau role at Loyalytics include 1-2 years of professional experience in BI/Data Analytics roles, proficiency in writing complex SQL queries, hands-on experience with Tableau Desktop, understanding of data modeling concepts and ETL workflows, familiarity with other BI tools like Power BI and Qlik, exposure to Tableau Server or Tableau Cloud, and knowledge of cloud platforms or databases such as AWS, GCP, Azure, Snowflake, or BigQuery. This is an exciting opportunity to join a dynamic and innovative team at Loyalytics and contribute to transforming data into valuable insights for clients in the retail industry.,
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Engineer, your primary responsibility will be to design and develop robust ETL pipelines using Python, PySpark, and various Google Cloud Platform (GCP) services. You will be tasked with building and optimizing data models and queries in BigQuery to support analytics and reporting needs. Additionally, you will play a crucial role in ingesting, transforming, and loading structured and semi-structured data from diverse sources. Collaboration with data analysts, scientists, and business teams is essential to grasp and address data requirements effectively. Ensuring data quality, integrity, and security across cloud-based data platforms will be a key part of your role. You will also be responsible for monitoring and troubleshooting data workflows and performance issues. Automation of data validation and transformation processes using scripting and orchestration tools will be a significant aspect of your day-to-day tasks.

Your hands-on experience with Google Cloud Platform (GCP), particularly BigQuery, will be crucial. Proficiency in Python and/or PySpark programming, along with experience in designing and implementing ETL workflows and data pipelines, is required. A strong command of SQL and data modeling for analytics is essential. Familiarity with GCP services like Cloud Storage, Dataflow, Pub/Sub, and Composer will be beneficial. An understanding of data governance, security, and compliance in cloud environments is also expected. Experience with version control using Git and agile development practices will be advantageous for this role.
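As an illustrative aside, a minimal PySpark transform of the kind this role involves. The paths, columns, and local Parquet target are assumptions; a real pipeline would typically read from Cloud Storage and load BigQuery via the spark-bigquery connector:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Hypothetical input file; on GCP this would usually be a gs:// URI.
events = (
    spark.read.option("header", True).csv("events.csv")
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("user_id").isNotNull())  # basic data-quality rule
)

# Aggregate raw events into a daily summary suitable for reporting.
daily = (
    events.groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Local Parquet output stands in for the BigQuery load step in this sketch.
daily.write.mode("overwrite").parquet("daily_event_counts")
spark.stop()
```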
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Engineer, you will be responsible for designing and developing robust ETL pipelines using Python, PySpark, and Google Cloud Platform (GCP) services. Your role will involve building and optimizing data models and queries in BigQuery for analytics and reporting purposes. You will also be responsible for ingesting, transforming, and loading structured and semi-structured data from various sources. Collaboration with data analysts, scientists, and business teams to comprehend data requirements will be a key aspect of your job. Ensuring data quality, integrity, and security across cloud-based data platforms is crucial. Monitoring and troubleshooting data workflows and performance issues will also be part of your responsibilities. Automation of data validation and transformation processes using scripting and orchestration tools will be an essential aspect of your role. You are required to have hands-on experience with Google Cloud Platform (GCP), especially BigQuery. Strong programming skills in Python and/or PySpark are necessary for this position. Your experience in designing and implementing ETL workflows and data pipelines will be valuable. Proficiency in SQL and data modeling for analytics is required. Familiarity with GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Composer is preferred. Understanding data governance, security, and compliance in cloud environments is essential. Experience with version control tools like Git and agile development practices will be beneficial for this role. If you are looking for a challenging opportunity to work on cutting-edge data engineering projects, this position is ideal for you.,
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As an Informatica Specialist with expertise in version 10.2 and above, you will be responsible for designing and developing Informatica ETL workflows. Your role will involve configuring Informatica ETL mappings and sessions, monitoring Informatica workflows and sessions, creating reports, and visualizing data. Additionally, you will be involved in system testing and automating processes to ensure efficiency and accuracy. Your contributions will play a crucial role in the successful implementation of ETL solutions within the organization.
Posted 2 months ago
3.0 - 7.0 years
3 - 7 Lacs
Gurgaon, Haryana, India
On-site
This position requires a proven track record of transforming processes, driving customer value, and delivering cost savings, with experience running end-to-end analytics for large-scale organizations.

Responsibilities:
- Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs.
- Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions.
- Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes.
- Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments.
- Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics.
- Provide seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud.
- Manage scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost.

Must have:
- Client engagement experience and collaboration with cross-functional teams
- Data engineering background in Databricks
- Ability to work effectively as an individual contributor or in collaborative team environments
- Effective communication and thought leadership with a proven record

Candidate Profile:
- Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas
- 3+ years of experience, which must be in data engineering
- Hands-on experience with SQL, Python, Databricks, and cloud platforms such as Azure
- Prior experience in managing and delivering end-to-end projects
- Outstanding written and verbal communication skills
- Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges
- Able to understand cross-cultural differences and work with clients across the globe
Posted 2 months ago
5.0 - 10.0 years
10 - 20 Lacs
Vijayawada
Work from Office
We're Hiring: Business Process Analyst (Data & Governance Focus)
Location: Vijayawada
Experience: 5-10 Years | Type: Full-Time | Industry: Business Process Management, Data Governance, Public Sector Projects

Are you skilled in mapping and optimizing complex workflows? Do you have experience aligning business processes with large-scale data systems? We're looking for a detail-oriented Business Process Analyst to join our growing team.

Role Overview
As a Business Process Analyst, you'll work closely with data, governance, and technical teams to improve process efficiency, ensure consistency in data handling, and drive cross-functional automation initiatives.

Key Responsibilities
- Analyze, document, and optimize business processes related to data management and data lakes
- Develop and maintain Standard Operating Procedures (SOPs) for metadata creation and data dictionary usage
- Oversee workflows related to data receipt, cleaning, normalization, and ETL automation
- Identify and resolve bottlenecks in data collection, interoperability, and sharing mechanisms

Required Skills
- Expertise in business process modeling and workflow automation
- Experience with data governance frameworks and ETL workflows
- Understanding of government/public sector workflows and data policies
- Proficiency in tools like Microsoft Visio, Lucidchart, or enterprise process mapping platforms

Preferred Certifications
- Lean Six Sigma (Green Belt or Black Belt)
- CBPP – Certified Business Process Professional
- PRINCE2 or PMP – Project Management Certifications
Posted 2 months ago
4.0 - 7.0 years
4 - 7 Lacs
Navi Mumbai, Maharashtra, India
On-site
Power BI Developer
Company: Kiya.ai

Role & Responsibilities:
As a Power BI Developer, you will play a crucial role in supporting the MI CMA team by creating impactful Power BI dashboards and identifying key performance indicators (KPIs) for senior management presentations. You will be responsible for the end-to-end development of BI solutions, from understanding business requirements to designing, developing, and deploying interactive dashboards.

Business Domain:
- Support the MI CMA team in creating Power BI dashboards.
- Identify and report KPIs suitable for senior management presentations.
- Be a good team player with a strong learning attitude, capable of performing under strict timelines and demanding assignments.
- Play a key role in communicating with stakeholders daily, understanding requirements, preparing user story points/documentation, and collaborating with IT.
- Possess experience working in an Agile environment.

Technical Domain:
- Design and develop ETL workflows and datasets in Power BI.
- Demonstrate solid, deep experience with data extract, transform, and load (ETL) tools and data visualization tools.
- Develop self-service models and data analytics using the Power BI service.
- Design and deploy rich graphic visualizations with drill-down and drop-down menu options, and parameterize them using Power BI.
- Create dashboards and reports within Power BI based on written requirements.
- Interact with business users and effectively understand their requirements.
- Communicate complex topics to the team through both written and oral communications.
- Ensure proper documentation and maintain Power BI dashboards.
- Conduct UAT (User Acceptance Testing) and provide sign-off on new developments.

Skill Set:
- Business (Mandatory): Basic understanding of Investment Banking products.
- Technical (Mandatory): Power BI; strong understanding of M, DAX, SQL, and MDX queries; strong understanding of Power BI connectivity with different sources like Denodo and Oracle databases; ability to create and manage Power BI dashboards from scratch, as well as maintain/support existing dashboards.
- Desired Technical Skills: Alteryx, Python.

Qualifications: Bachelor's Degree, preferably in Commerce / Bachelor's in Technology / MBA in Finance, or an IT qualification.
Posted 2 months ago
0.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Ready to build the future with AI? At Genpact, we don't just keep up with technology; we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Lead Consultant - Alteryx & BI

Responsibilities
- Demonstrate an understanding of technology and digital frameworks in the context of data integration.
- Ensure code and design quality through the execution of test plans and assist in the development of standards, methodology, and repeatable processes, working closely with internal and external design, business, and technical counterparts.
- Create technical specs, code, and configuration.
- Design and develop ETL workflows and datasets using Alteryx Designer.
- Design and develop ETL workflows and datasets in Alteryx to be used by the BI reporting tool, specifically creating datasets to be used by Tableau Data Extract for BI.
- Hands-on skills and experience in Alteryx Designer, Alteryx Server, and the tools within Alteryx such as Predictive, Parsing, and Transforms, as well as interactive tools and different transform logics.
- Perform data blending, joining, and parsing of data, and manage data lineage.
- Create reports, render them, and create layouts.
- Work with Alteryx tools, macros, and analytical apps on top of the workflows, and create chained apps and standard, batch, and iterative tools.
- Write SQL queries against any RDBMS with query optimization.
- Debug issues, test, and work with stakeholders on errors.
- Create the curated layer, standardize workflows in Alteryx, and document them.
- Design, build, and test new workflows with existing stakeholders.

Qualifications we seek in you!
Minimum qualification
- BE/B.Tech, BCA, MCA, BSc/MSc, MBA
- Adaptive to Agile methodology, with Scrum experience
- Strong knowledge and experience of SQL
- Understanding of dimensional models
- Experience with development and production support

Behavioral / team skills
- Personal drive and positive work ethic to deliver results within tight deadlines and in demanding situations.
- Flexibility to adapt to a variety of engagement types, working hours, and work environments and locations.
- Excellent communication and negotiation skills.

Why join Genpact?
- Lead AI-first transformation - Build and scale AI solutions that redefine industries
- Make an impact - Drive change for global enterprises and solve business challenges that matter
- Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
- Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
- Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build
- Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 2 months ago
0.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Ready to build the future with AI? At Genpact, we don't just keep up with technology; we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Manager - WD Adaptive

In this role you will be responsible for Workday Adaptive development. You will interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big-data technologies.

Key Responsibilities:
- Workday Adaptive development
- Prepare High Level Design and ETL design
- Creation and support of batch and real-time data pipelines built on AWS technologies including Glue, Redshift/Spectrum, Kinesis, EMR, and Athena
- Design AWS ETL workflows and ETL mappings
- Maintain large ETL workflows; review and test ETL programs
- Experience in AWS Athena and Glue PySpark, EMR, DynamoDB, Redshift, Kinesis, Lambda, and Snowflake

Qualifications we seek in you!
Minimum Qualifications
- Education: Bachelor's degree in computer science, Engineering, or a related field (or equivalent experience)
- Relevant years of experience in Workday Adaptive development
- Experience in creation and support of batch and real-time data pipelines built on AWS technologies including Glue, Redshift/Spectrum, Kinesis, EMR, and Athena
- Experience in preparing High Level Design and ETL design, maintaining large ETL workflows, and reviewing and testing ETL programs

Preferred Qualifications/Skills
- Proficient in AWS Redshift, S3, Glue, Athena, DynamoDB
- Experience in Python and Java
- Should have performed an ETL developer role in at least 3 large end-to-end projects
- Good experience in performance tuning and debugging of ETL programs
- Good experience in database and data warehouse concepts, SCD1, SCD2, and SQL

Why join Genpact?
- Lead AI-first transformation - Build and scale AI solutions that redefine industries
- Make an impact - Drive change for global enterprises and solve business challenges that matter
- Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
- Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
- Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build
- Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 3 months ago
4.0 - 6.0 years
10 - 11 Lacs
Remote, India
On-site
Requirements:
- 4+ years of experience working with the Statistica application.
- 2+ years of experience working in Tableau or Power BI.
- 4+ years of experience working with SQL queries.
- Experience working with VB scripts.
- Experience working with various database management systems, including Oracle, SQL Server, and PI.
- Experience working in a Life Sciences regulated environment and a strong command of software validation and GxP processes.
- Ability to work independently as well as collaborate within a team.

Responsibilities:
- Develop ETL workflows in Statistica.
- Develop report templates to support automated reports within Statistica.
- Support installing hotfixes and perform enhancements to support better data access.
- Create/update data configurations and queries.
- Troubleshoot issues reported by business users around system usage.
- Create knowledge articles to address FAQs.
- Build or update Tableau dashboards to support analytics and visualization needs.
- Work with the business to verify dashboard contents.
- Build solutions in Power BI to support business intelligence needs.
Posted 3 months ago
3.0 - 5.0 years
5 - 9 Lacs
Chennai
Work from Office
Role & responsibilities:
- Design, develop, and maintain Power BI dashboards and interactive reports for cross-functional departments (e.g., Sales, Operations, HR, Finance).
- Connect, transform, and model data from both online sources (APIs, cloud platforms, Databricks, databases) and offline sources (Excel, CSV, etc.).
- Integrate data from Databricks, MySQL, and SparkSQL for comprehensive analytics and visual storytelling.
- Handle large-scale structured and unstructured data with high performance and efficiency.
- Automate recurring reporting processes using Power BI Service, Dataflows, and scheduled refreshes.
- Develop reusable and scalable data models, datasets, and report templates.
- Write efficient DAX and Power Query (M) expressions to support complex business logic.
- Ensure dashboards meet user needs and are optimized for performance and usability.
- Collaborate with business users and technical teams to gather requirements and deliver insights.
- Maintain organizational data governance, security, and compliance standards across all BI solutions.

Preferred candidate profile:
- Experience with Azure Data Factory, Azure Synapse, or similar data orchestration tools.
- Knowledge of Git-based version control and CI/CD pipelines for BI deployments.
- Microsoft certifications (e.g., DA-100 / PL-300 / Azure Data Engineer).
- 3-5 years of hands-on experience developing dashboards using Power BI.
- Strong practical knowledge of Databricks, MySQL, and SparkSQL.
- Proven experience working with large datasets, including structured (relational DBs) and unstructured (logs, JSON, files) data.
- Expertise in connecting to both online (cloud/real-time) and offline (local/file-based) data sources.
- Proficiency in DAX, Power Query (M), and advanced data modeling.
- Strong understanding of data architecture, ETL workflows, and BI best practices.
- Excellent communication skills to interact with stakeholders across departments.
- Ability to work independently and manage multiple dashboard/reporting projects simultaneously.
Posted 3 months ago
2.0 - 7.0 years
4 - 7 Lacs
Hyderabad
Work from Office
Design, develop, and deploy ETL workflows and mappings using Informatica PowerCenter. Extract data from various source systems and transform/load it into target systems. Troubleshoot ETL job failures and resolve data issues promptly. Optimize and tune complex SQL queries. Maintain detailed documentation of ETL design, mapping logic, and processes. Ensure data quality and integrity through validation and testing.

Required candidate profile: Experience with Informatica PowerCenter and strong SQL knowledge.

Perks and benefits
Posted 3 months ago