1.0 - 5.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have at least 3 years of hands-on experience in data modeling, ETL processes, reporting-system development, and data engineering using tools such as BigQuery, SQL, Python, or Alteryx, along with advanced knowledge of SQL programming and database management. You should also have a minimum of 3 years of solid experience with Business Intelligence reporting tools such as Power BI, Qlik Sense, Looker, or Tableau, and a good understanding of data warehousing concepts and best practices.

Excellent problem-solving and analytical skills are essential for this role, as are attention to detail and strong communication and collaboration skills. The ability to work both independently and as part of a team is crucial for success in this position.

Preferred skills include experience with GCP cloud services such as BigQuery, Cloud Composer, Dataflow, CloudSQL, Looker, Looker ML, Data Studio, and GCP QlikSense. Strong SQL skills and proficiency in BI/reporting tools to build self-serve reports, analytic dashboards, and ad-hoc packages on top of enterprise data warehouses are also desired, as is at least 1 year of experience in Python and Hive/Spark/Scala/JavaScript. You should have a solid understanding of consuming data models, developing SQL, addressing data quality issues, proposing BI solution architecture, and articulating best practices in end-user visualizations, along with development delivery experience. A good grasp of BI tools, architectures, and visualization solutions, an inquisitive and proactive approach to learning new tools and techniques, strong oral, written, and interpersonal communication skills, and comfort working in a dynamic environment where problems are not always well-defined round out the profile.
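The self-serve reporting the posting describes is, at its core, SQL-style aggregation. As a minimal illustrative sketch (all field names are hypothetical, not from any particular warehouse), a pure-Python equivalent of a `GROUP BY` revenue report:

```python
from collections import defaultdict

def revenue_by_region(rows):
    """Aggregate order rows into a per-region summary, mimicking
    SELECT region, SUM(amount), COUNT(*) ... GROUP BY region."""
    totals = defaultdict(lambda: {"revenue": 0.0, "orders": 0})
    for row in rows:
        bucket = totals[row["region"]]
        bucket["revenue"] += row["amount"]
        bucket["orders"] += 1
    return dict(totals)

orders = [
    {"region": "south", "amount": 120.0},
    {"region": "south", "amount": 80.0},
    {"region": "north", "amount": 50.0},
]
summary = revenue_by_region(orders)
```

In practice this logic would live in BigQuery SQL or a BI tool's semantic layer; the Python version just makes the aggregation explicit.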
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
noida, uttar pradesh
On-site
The ideal candidate will be responsible for designing, developing, and maintaining an optimal data pipeline architecture. You will monitor incidents, perform root cause analysis, and implement appropriate actions to resolve issues related to abnormal job execution and data corruption, and you will automate jobs, notifications, and reports to improve efficiency. You should be able to optimize existing queries, reverse engineer for data research and analysis, and assess the downstream impact of issues for effective communication. Supporting failures and data quality issues and ensuring environment health will also be part of your role. Furthermore, you will maintain ingestion and pipeline runbooks, portfolio summaries, and DBAR, while enabling the roadmap of infrastructure changes, enhancements, and updates. Building the infrastructure for optimal extraction, transformation, and loading of data from various sources using big data technologies, Python, or web-based APIs will be essential. You will participate in code reviews with peers and will need excellent communication skills to understand and convey requirements effectively.

You are expected to have a Bachelor's degree in Engineering/Computer Science or a related quantitative field. Required technical skills include a minimum of 8 years of programming experience with Python and SQL, experience with massively parallel processing systems like Spark or Hadoop, and a minimum of 6-7 years of hands-on experience with GCP, BigQuery, Dataflow, data warehousing, data modeling, Apache Beam, and Cloud Storage. Proficiency with source code control systems (Git) and CI/CD processes, involvement in designing, prototyping, and delivering software solutions within the big data ecosystem, and hands-on experience with generative AI models are also necessary.
You should be able to perform code reviews to ensure code meets acceptance criteria, have experience with Agile development methodologies and tools, and work towards improving data governance and quality to enhance data reliability.

EXL Analytics offers a dynamic and innovative environment where you will collaborate with experienced analytics consultants. You will gain insights into various business aspects, develop effective teamwork and time-management skills, and receive training in analytical tools and techniques. Our mentoring program provides guidance and coaching to every employee, fostering personal and professional growth. The opportunities for growth and development at EXL Analytics are limitless, setting the stage for a successful career within the company and beyond.
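Monitoring job execution and surfacing root causes, as the role above requires, usually means wrapping pipeline steps with retries and notifications. A minimal sketch, assuming a generic notification hook rather than any specific alerting tool used at this company:

```python
import time

def run_with_retry(step, retries=3, delay=0.0, notify=print):
    """Run one pipeline step; on failure, retry and surface the
    root cause through a notification hook instead of failing silently."""
    last_error = None
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception as exc:
            last_error = exc
            notify(f"attempt {attempt} failed: {exc}")
            time.sleep(delay)
    raise RuntimeError(f"step failed after {retries} attempts") from last_error

calls = {"n": 0}
def flaky_extract():
    # Hypothetical source that fails transiently on the first two calls.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ValueError("transient source error")
    return [{"id": 1}]

rows = run_with_retry(flaky_extract, retries=5, notify=lambda msg: None)
```

The same pattern scales up to orchestrators like Airflow or Dataflow, where retries and alert callbacks are configured declaratively instead of hand-rolled.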
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
You will play a crucial role as the Lead Analytics Consultant in the Data and Analytics Services team at Sun Life. Your primary responsibility will be to create innovative visual analytics solutions that enable faster and more informed decision-making within the organization. Your expertise will be instrumental in advancing strategic analytics projects that serve the needs of our business stakeholders.

To excel in this role, you should have a minimum of 5 years of experience in Tableau development, including designing dashboards and decision enablement tools. Proficiency in the Tableau Desktop and Server platforms, as well as SQL programming and PL/SQL, is essential. You will be expected to develop a diverse range of design use cases that prioritize user experience and to work in an agile development environment. Ideally, you should hold a graduate degree in Mathematics, Computer Science, Engineering, or a related field.

Your responsibilities will involve demonstrating visual design expertise, creating complex calculations, and implementing advanced dashboard practices in Tableau. You should also be adept at data modeling, performance tuning of Tableau Server dashboards, and designing custom landing pages for an enhanced user experience. The ability to work independently, manage engagements with business partners, and extract meaningful insights from data will be key to your success in this role.

Additional skills such as Tableau Desktop and Server certification, experience with R/Python/Matlab/SAS, and exposure to Agile or design-driven development of analytics applications will be advantageous. If you are passionate about leveraging data and analytics to drive meaningful business outcomes, this role offers an exciting opportunity to make a difference in the lives of individuals, families, and communities around the world.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
As a Junior Data Analyst at our company in Hyderabad, you will play a crucial role in our Data & Analytics team. We are looking for a motivated and intellectually curious individual with a solid foundation in data analysis, business intelligence, and critical thinking. Your responsibilities will include interpreting data, generating insights, and supporting strategic initiatives across various business units. You will collaborate with internal stakeholders to understand business requirements, work with large datasets, build dashboards, and deliver actionable insights that drive informed business decisions. Your key responsibilities will span data extraction and transformation, reporting and dashboarding, data analysis, stakeholder collaboration, data quality and governance, and communication and documentation.

To excel in this role, you must possess a Bachelor's degree in Computer Science, Mathematics, Statistics, Economics, Engineering, or a related field, and have 2-3 years of hands-on experience in data analytics or business intelligence roles. Strong analytical thinking, proficiency in SQL, experience with ETL processes, and knowledge of Excel and data visualization tools like Tableau or Power BI are essential. Excellent communication skills, attention to detail, and the ability to manage multiple priorities and deadlines are also required.

Preferred or bonus skills include exposure to scripting languages like Python or R, experience with cloud platforms and tools (e.g., AWS, Snowflake, Google BigQuery), prior experience in the financial services or fintech domain, and an understanding of data modeling and warehousing concepts.

In return, we offer a collaborative, inclusive, and intellectually stimulating work environment, opportunities for learning and growth through hands-on projects and mentorship, and the chance to work with data that drives real business impact.
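The "reporting & dashboarding" work described above typically starts with rolling raw records up into a time series. As a small illustrative sketch (the event structure is hypothetical), a monthly rollup of the kind that feeds a Tableau or Power BI dashboard:

```python
from datetime import date

def monthly_signups(events):
    """Roll raw event records up into a (year, month) -> count series,
    the shape of aggregate most dashboards are built on."""
    counts = {}
    for event in events:
        key = (event["when"].year, event["when"].month)
        counts[key] = counts.get(key, 0) + 1
    return dict(sorted(counts.items()))

events = [
    {"when": date(2024, 1, 5)},
    {"when": date(2024, 1, 20)},
    {"when": date(2024, 2, 2)},
]
series = monthly_signups(events)
```

In a production setting this would be a SQL query or an ETL step; the Python version shows the transformation in isolation.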
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
noida, uttar pradesh
On-site
Your journey at Crowe starts here, with the opportunity to build a meaningful and rewarding career. At Crowe, you are trusted to deliver results and make an impact while having the flexibility to balance work with life moments. Your well-being is cared for, and your career is nurtured in an inclusive environment where everyone has equitable access to opportunities for growth and leadership. With over 80 years of history, Crowe has excelled in delivering excellent service through innovation across its audit, tax, and consulting groups.

As a Data Engineer at Crowe, you will provide critical integration infrastructure for analytical support and solution development for the broader Enterprise, using market-leading tools and methodologies. Your expertise in API integration, pipelines or notebooks, programming languages (Python, Spark, T-SQL), dimensional modeling, and advanced data engineering techniques will be key to creating and delivering robust solutions and data products. You will be responsible for designing, developing, and maintaining the Enterprise Analytics Platform to support data-driven decision-making across the organization. Success in this role requires a strong interest and passion in data analytics and ETL/ELT best practices, critical thinking and problem-solving, and excellent interpersonal, communication, listening, and presentation skills. The Data team strives for an unparalleled client experience and will look to you to promote success and enhance the firm's image firmwide.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Data Analytics, Data/Information Science, Information Systems, Mathematics, or a related field, along with specific years of experience in SQL, data warehousing concepts, programming languages, managing projects, and utilizing tools like Microsoft Power BI, Delta Lake, or Apache Spark. Hands-on experience or certification with Microsoft Fabric is preferred.
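Dimensional modeling, one of the skills called out above, revolves around assigning surrogate keys to dimension members so fact tables can reference them stably. A minimal sketch of that bookkeeping (the customer dimension and its attributes are invented for illustration):

```python
def upsert_dimension(dim, natural_key, attributes):
    """Assign a stable surrogate key to each natural (business) key,
    the core bookkeeping behind a star-schema dimension table."""
    if natural_key not in dim:
        dim[natural_key] = {"sk": len(dim) + 1, **attributes}
    return dim[natural_key]["sk"]

customer_dim = {}
sk1 = upsert_dimension(customer_dim, "CUST-001", {"name": "Acme"})
sk2 = upsert_dimension(customer_dim, "CUST-002", {"name": "Globex"})
sk1_again = upsert_dimension(customer_dim, "CUST-001", {"name": "Acme"})
```

Real warehouses add change tracking (e.g., slowly changing dimensions) on top of this, but the surrogate-key lookup is the common foundation.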
Upholding Crowe's values of Care, Trust, Courage, and Stewardship is essential in this position, as we expect all team members to act ethically and with integrity at all times. Crowe offers a comprehensive benefits package to its employees and provides an inclusive culture that values diversity. You will have the opportunity to work with a Career Coach who will guide you in your career goals and aspirations.

Crowe, a subsidiary of Crowe LLP (U.S.A.), a public accounting, consulting, and technology firm, is part of Crowe Global, one of the largest global accounting networks in the world. Crowe does not accept unsolicited candidates, referrals, or resumes from any staffing agency or third-party paid service. Referrals, resumes, or candidates submitted without a pre-existing agreement will be considered the property of Crowe.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Technical Lead specialized in Sterling Commerce, your role will involve designing and delivering order management and fulfillment solutions. With a minimum of 5 years of experience in Sterling Commerce applications, you will lead the technical aspects of solution design, performance impact analysis, and code reviews throughout the full software development lifecycle. Your expertise in multichannel order management, object-oriented concepts, Java, J2EE, XML, XSD, XSLT, and other related technologies will be crucial for this role. You should also have knowledge of the service definition framework, JMS queues, warehouse integration, pricing, contracts, and data modeling for customers.

Immediate joiners are preferred for this position, and a background in the IT/Computers-Software industry along with a B.Sc/B.Com/M.Sc/MCA/B.E/B.Tech education is required. If you have at least 3 years of experience in a similar role and are keen to take on new challenges in an innovative environment, we encourage you to apply for this exciting opportunity. Please send your application to jobs@augustainfotech.com if you meet the above requirements and are ready to contribute as a valuable member of our team.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
You should have a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, along with more than 10 years of professional experience in software development using .NET Core and C#. Your expertise should include designing and developing microservices architectures, and hands-on experience with ASP.NET Core, Entity Framework Core, Web APIs, and RESTful services. A deep understanding of SQL Server, NoSQL databases, and data modeling is required, as well as experience with cloud platforms such as Azure or AWS, including deployment and monitoring. You should also possess a solid understanding of CI/CD pipelines, automated testing, and DevOps practices. Familiarity with frontend frameworks like Angular, React, or Vue is a plus. Strong problem-solving skills and the ability to lead complex technical projects are essential, along with excellent communication and interpersonal skills.

Preferred qualifications include experience with containerization tools such as Docker, orchestration platforms like Kubernetes, and knowledge of message brokers such as RabbitMQ, Kafka, or Azure Service Bus. Experience with event-driven architecture and domain-driven design, as well as prior experience as a technical architect or team lead, is also desirable. Working experience in heterogeneous architectures will be an added advantage.

Your key responsibilities will involve leading the architecture, design, development, and deployment of .NET Core-based applications and microservices. You will define technical standards and coding practices and ensure best practices in software development. Collaborating with product owners, business analysts, and other architects to translate business requirements into scalable technical solutions is crucial. Additionally, you will provide technical leadership and mentorship to developers, conduct code reviews, and guide team skill development.
Driving continuous improvement in system architecture, security, performance, and scalability is part of your role, along with evaluating and recommending new technologies, tools, and frameworks to enhance development efficiency. You will also ensure the integration of applications with various backend systems, databases, and third-party services, and lead troubleshooting and resolution of complex technical issues. Active participation in sprint planning, estimation, and agile ceremonies is expected, as is documenting architectural designs, development processes, and technical specifications.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
kochi, kerala
On-site
We are seeking a highly analytical and results-driven Procurement Analyst to join our team. Your role will involve evaluating procurement and supply chain performance, identifying improvement opportunities, and providing data-driven insights to optimize efficiency, reduce costs, and enhance procurement strategy.

Conducting in-depth data analysis to support strategic procurement decisions and cost-saving initiatives is a key responsibility. You will also develop, automate, and maintain dashboards and reports using tools like Power BI, Tableau, Python, Excel, or SQL. Evaluating supplier performance, analyzing purchasing patterns, and monitoring market trends to inform sourcing strategies are crucial aspects of your role, as is monitoring key performance indicators (KPIs) such as cost reductions, PO cycle time, procurement ROI, and price competitiveness. Collaboration with procurement, logistics, production, and finance teams to align on strategies is essential, and you will lead data modeling, spend analytics, and forecasting for procurement categories. Ensuring procurement data accuracy, driving compliance with internal policies, identifying process bottlenecks for operational efficiency improvements, and providing actionable insights and reporting to leadership for continuous process enhancement round out the role.

To qualify for this position, you should hold a Bachelor's or Master's degree in Supply Chain, Business Analytics, Economics, Engineering, or a related field; degrees from US or UK universities are preferred. You should have at least 5 years of experience in procurement, sourcing analysis, supply chain management, or a data analytics role within supply chain functions. Proficiency in advanced data analytics tools and data visualization platforms is required.
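Of the KPIs listed above, PO cycle time is the most mechanical to compute: average days from purchase-order creation to receipt, skipping orders still open. A minimal sketch (the PO record layout is hypothetical):

```python
from datetime import date

def average_po_cycle_time(purchase_orders):
    """Average days from PO creation to receipt - a common
    procurement KPI. Open POs (no receipt date) are excluded."""
    durations = [
        (po["received"] - po["created"]).days
        for po in purchase_orders
        if po.get("received")
    ]
    return sum(durations) / len(durations) if durations else None

pos = [
    {"created": date(2024, 3, 1), "received": date(2024, 3, 11)},
    {"created": date(2024, 3, 5), "received": date(2024, 3, 25)},
    {"created": date(2024, 3, 9), "received": None},  # still open
]
avg_days = average_po_cycle_time(pos)
```

In a dashboard this would be computed per category or per supplier and trended over time, but the core measure is this single average.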
Strong knowledge of procurement processes, cost drivers, and supplier management principles is essential, as are excellent communication skills to present data-driven insights to technical and non-technical audiences. Experience with procurement software or enterprise resource planning (ERP) systems is desirable.

Join a global family of professionals driven by purposeful innovation to power the industry that powers the world. Through technical expertise, advanced equipment, and operational support, you will contribute to creating a lasting impact for customers and communities worldwide. Anticipating customer needs and delivering high-quality products and services on time and within budget is the ethos that guides our team's efforts.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
nagpur, maharashtra
On-site
At Datatype IT Consulting, we specialize in staffing and recruitment services that connect exceptional talent with leading organizations. Our mission is to align top professionals with opportunities that match their skills and career goals, ensuring a perfect fit for both candidates and employers. With a deep understanding of the IT industry and a commitment to excellence, we are a trusted partner in workforce solutions. Whether you're a company building a strong team or a candidate seeking the next career step, we are here to guide you with precision and integrity.

This is a full-time on-site role located in Nagpur for a Data Analyst. The Data Analyst will be responsible for analyzing data sets to identify trends, creating data models, and utilizing statistical methods to provide actionable insights. The role involves collaborating with other team members, interpreting complex data, and effectively communicating findings to stakeholders. Additionally, the Data Analyst will be expected to maintain data integrity and ensure accuracy in reporting.

The ideal candidate should possess strong analytical skills and expertise in data analytics. Proficiency in statistics and data modeling is required, along with excellent communication skills for interpreting and presenting data. Being detail-oriented with the ability to maintain data accuracy is essential. A Bachelor's degree in Data Science, Statistics, Computer Science, or a related field is mandatory. Experience with data visualization tools and software would be a plus, and previous experience in a similar role within the IT industry is beneficial.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
You will be joining Birlasoft as a Genio OpenText ETL Developer. Birlasoft is a global leader in Cloud, AI, and Digital technologies, known for its domain expertise and enterprise solutions. As part of the CKA Birla Group, with over 12,000 professionals, Birlasoft is committed to building sustainable communities and empowering societies worldwide.

As a Genio OpenText ETL Developer, you will play a crucial role in designing, developing, and maintaining ETL workflows using OpenText Genio to support data integration and migration projects. You will collaborate with business analysts and data architects to understand data requirements and translate them into technical specifications, and you will implement data extraction, transformation, and loading processes to integrate data from various sources. You will optimize ETL workflows for performance and scalability, perform data quality checks, ensure data integrity throughout the ETL process, and troubleshoot and resolve ETL-related issues. Additionally, you will document ETL processes, maintain technical documentation, and provide support and guidance to junior ETL developers.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, proven experience as an ETL Developer with a focus on OpenText Genio, and strong knowledge of ETL concepts, data integration, and data warehousing. Proficiency in SQL, experience with database management systems, familiarity with data modeling and data mapping techniques, excellent problem-solving skills, attention to detail, and strong communication and teamwork abilities are essential. Preferred qualifications include experience with other ETL tools and technologies and knowledge of Agile development methodologies.
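The "data quality checks" step mentioned above usually means validating required fields and uniqueness of business keys before loading. A minimal, tool-agnostic sketch of such a check (not Genio-specific; row and field names are illustrative):

```python
def check_quality(rows, required, unique_key):
    """Return a list of data-quality findings: rows with missing
    required fields and rows that duplicate the business key."""
    findings, seen = [], set()
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) in (None, ""):
                findings.append((i, f"missing {field}"))
        key = row.get(unique_key)
        if key in seen:
            findings.append((i, f"duplicate {unique_key}={key}"))
        seen.add(key)
    return findings

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 1, "email": ""},  # duplicate key and missing email
]
issues = check_quality(rows, required=["email"], unique_key="id")
```

In an ETL tool the equivalent checks are configured as validation rules on the workflow, with failing rows routed to a reject table instead of a findings list.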
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
coimbatore, tamil nadu
On-site
As a seasoned Business Analyst at Colruyt in Coimbatore, India, you will collaborate with our business and IT organizations to deliver business-led, technology-enabled solutions that address unique challenges and opportunities. Your role will involve analyzing processes, identifying improvements, and developing solutions to optimize business processes. You will be responsible for describing, testing, and implementing solutions while supporting the business partner in integrating these solutions within the organization.

Your key responsibilities will include analyzing and modeling the to-be system, identifying opportunities for improvement, and defining functional and non-functional requirements. You will work closely with development teams to translate business requirements into technology solutions, prepare test cases, and document scenarios for solution evaluation. Additionally, you will play a crucial role in change management, user manual preparation, and stakeholder communication.

To excel in this role, you are expected to have at least 8 years of industry experience, with a minimum of 5 years in business analysis. You should be comfortable working with a global customer and user base in a collaborative environment, capable of making independent decisions, and proactive in identifying and addressing challenges. Proficiency in ALM tools such as JIRA, Confluence, Visio, and Lucidchart is essential to maximize productivity. A customer-oriented approach, analytical thinking, and attention to detail are crucial traits for success in this role.

While not mandatory, experience working with APIs, defining logical data models, or engaging in onsite-offshore delivery for complex IT projects would be advantageous. Proficiency in requirements elicitation practices, such as interviews, questionnaires, and user stories, is desirable.
The ability to guide discussions with business teams, architects, and developers to define IT solutions and processes is also a key aspect of the role. In summary, as a Business Analyst at Colruyt, you will play a pivotal role in driving business process improvements, optimizing technology solutions, and fostering strong professional relationships across all levels and departments within the organization. Your adaptability, innovative thinking, and commitment to delivering high-quality solutions within defined timelines will be instrumental to success in this role.
Posted 1 week ago
2.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for a skilled Full Stack MERN Developer to join our team in Pune. If you are passionate about working with the MERN stack and NestJS and enjoy tackling complex business challenges to deliver scalable features, we would like to hear from you.

Your responsibilities will include designing, developing, and maintaining web applications using MongoDB, Express/NestJS, React.js, and Node.js. You will enhance existing codebases and business logic and develop scalable APIs, services, and frontend components, writing clean, modular, and well-documented code. You will also take part in Agile development processes, participate in code reviews, collaborate with product managers, designers, and fellow developers, and identify and address performance issues and bugs.

To qualify for this position, you should have 1.5 to 7 years of experience building full stack JavaScript/TypeScript applications. Proficiency in MongoDB, React.js, and Node.js is required, and NestJS experience is preferred. An understanding of RESTful API design, data modeling, and authentication is essential, as are strong problem-solving and debugging skills, effective communication, and a collaborative approach. Familiarity with Docker, CI/CD tools, or microservices would be a bonus.

In return, you can expect engaging projects with technical complexity, exposure to real-world systems and architectural concepts, a collaborative team environment offering flexibility and autonomy, and a remote-first culture with adaptable work hours, along with learning opportunities that foster growth and drive innovation. If you are ready to take on challenging projects and contribute to impactful applications, please submit your resume to kirti.solanki@talenttrek.net.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
We are looking for an experienced Oracle Data Integrator (ODI) Developer with extensive knowledge of Oracle GoldenGate to join our team. With over 5 years of hands-on experience, you excel at building, operating, and enhancing data integration processes. Your responsibilities include working with large-scale data environments, ensuring seamless data migrations, and supporting real-time integration strategies.

Your key responsibilities will involve designing, developing, and executing data integration solutions using ODI and Oracle GoldenGate. You will create and manage ETL processes for data migration, transformation, and integration, and design real-time data replication setups using GoldenGate across various systems while optimizing and troubleshooting data pipelines for performance and reliability. Collaborating with cross-functional teams to translate business requirements into technical solutions will be crucial. You will coordinate with DBAs and infrastructure teams to ensure smooth integration and system performance, and manage data warehousing, synchronization, and migration tasks while supporting GoldenGate replication for high availability, disaster recovery, and data synchronization. Ensuring the scalability, performance, and security of integration configurations will be part of your role. You will also develop technical documentation, provide training on ODI and GoldenGate processes, support production environments, and troubleshoot data integration issues as required.
Required skills and qualifications include 5+ years of hands-on experience in ODI development and implementation; proven expertise in Oracle GoldenGate; strong command of SQL, PL/SQL, and scripting for data manipulation; a solid understanding of data modeling, ETL architecture, and multi-system integration; familiarity with Oracle databases and data warehousing concepts; experience with various ODI components; proficiency in configuring GoldenGate for high availability and disaster recovery; excellent troubleshooting and optimization skills for data pipelines; experience handling complex data migration and synchronization tasks; and the ability to excel in a fast-paced, client-facing environment.

Preferred skills include familiarity with other ETL tools such as Informatica and Talend, knowledge of Oracle Cloud Infrastructure (OCI) or other cloud platforms, certifications in ODI, GoldenGate, or other Oracle technologies, and experience with performance tuning in large-scale data integration projects. You should hold a Bachelor's degree in Computer Science, Information Technology, or a related field; relevant Oracle certifications (ODI, GoldenGate) are considered a plus.
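Data synchronization tasks like those above come down to classifying differences between a source and a target as inserts, updates, and deletes. GoldenGate does this continuously from the database redo log; the pure-Python snapshot diff below only illustrates the classification logic, not GoldenGate's actual mechanism (table contents are invented):

```python
def diff_snapshots(source, target):
    """Classify differences between two table snapshots keyed by
    primary key into inserts, updates, and deletes to apply to target."""
    inserts = {k: v for k, v in source.items() if k not in target}
    deletes = [k for k in target if k not in source]
    updates = {
        k: v for k, v in source.items()
        if k in target and target[k] != v
    }
    return inserts, updates, deletes

src = {1: "alice", 2: "bob-new", 4: "dana"}
tgt = {1: "alice", 2: "bob", 3: "carol"}
ins, upd, dels = diff_snapshots(src, tgt)
```

Snapshot diffing is what batch reconciliation jobs do; log-based replication avoids the full comparison by capturing each change as it is committed.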
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
indore, madhya pradesh
On-site
You will work out of our Indore office five days a week and will be required to work an evening shift to support our North American team. Your role will involve supporting clients in exploring how Vena can enhance their financial processes. As a Consultant, you will play a crucial part in implementing the Vena product, guiding clients on best practices, and enabling them to achieve their goals.

Your responsibilities will include configuring data models, financial templates, and reports, integrating data from customer systems, setting up automated workflows, and ensuring customer satisfaction and time-to-value metrics are met. You will actively engage in workshops with customers to gather requirements, communicate effectively to help customers adopt the product, collaborate with Project Managers to deliver projects on time and on budget, and demonstrate a proactive approach to learning and problem-solving.

To excel in this role, you should hold a Bachelor of Commerce or Accounting degree with 1-2 years of work experience in Financial Planning & Analysis (FP&A)/Accounting or software implementation. You should have a passion for learning new technology, improving business processes, and demonstrating resourcefulness. Excellent communication skills in English, both verbal and written, are essential, along with the ability to work collaboratively in a team environment and engage with stakeholders at all levels. Proficiency in Microsoft Excel, data integration, data modeling, and problem-solving is required. Experience with, or willingness to learn, database management (e.g., ETL, Oracle, SQL Server) and an interest in AI-driven solutions would be beneficial in enhancing performance and driving efficiencies.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You will be joining Cliff IT Solutions as a Sr. Data Modeler & Data Analyst located in Hyderabad. Your primary responsibility will be to design and implement data models, ensure data quality, and establish robust data governance frameworks. This full-time on-site role involves creating data architecture, managing Extract Transform Load (ETL) processes, and collaborating with stakeholders to enhance data systems. Your expertise in data management principles will be crucial in improving information systems effectively.

To excel in this role, you should possess skills in data governance, data quality, data modeling, and data architecture. Experience with ETL processes is essential, along with proficiency in analytical problem-solving. Strong communication and teamwork abilities are required to engage with stakeholders effectively. A degree in Computer Science, Information Technology, or a related field is preferred. Additional experience in the Identity Management and Security Governance domain, as well as with tools like Informatica, Teradata, Axiom, SQL, and Databricks, will be advantageous.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
You will be joining the Analytics Engineering team at DAZN, where your primary responsibility will be transforming raw data into valuable insights that drive decision-making across various aspects of our global business, including content, product development, marketing strategies, and revenue generation. Your role will involve constructing dependable and scalable data pipelines and models to ensure that data is easily accessible and actionable for all stakeholders.

As an Analytics Engineer with a minimum of 2 years of experience, you will play a crucial part in the construction and maintenance of our advanced data platform. Utilizing tools such as dbt, Snowflake, and Airflow, you will be tasked with creating well-organized, well-documented, and reliable datasets. This hands-on position is perfect for individuals aiming to enhance their technical expertise while contributing significantly to our high-impact analytics operations.

Your key responsibilities will involve:
- Developing and managing scalable data models using dbt and Snowflake
- Creating and coordinating data pipelines using Airflow or similar tools
- Collaborating with various teams within DAZN to transform business requirements into robust datasets
- Ensuring data quality through rigorous testing, validation, and monitoring
- Adhering to best practices in code versioning, CI/CD, and data documentation
- Contributing to the enhancement of our data architecture and team standards

We are seeking individuals with:
- A minimum of 2 years of experience in analytics/data engineering or related fields
- Proficiency in SQL and a solid understanding of cloud data warehouses (preferably Snowflake)
- Familiarity with dbt for data modeling and transformation
- Knowledge of Airflow or other workflow orchestration tools
- Understanding of ELT processes, data modeling techniques, and data governance principles
- Strong communication and collaboration skills

Nice to have:
- Previous experience in media, OTT, or sports technology sectors
- Familiarity with BI tools such as Looker, Tableau, or Power BI
- Exposure to testing frameworks like dbt tests or Great Expectations
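The dbt-on-Snowflake workflow described above — building a staging model and then running schema tests against it — can be sketched in plain Python. This is a minimal sketch, with sqlite3 standing in for Snowflake; the `raw_playback_events` table and its columns are illustrative assumptions, not DAZN's actual schema:

```python
import sqlite3

# Hypothetical raw events table; sqlite3 stands in for Snowflake here.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_playback_events (
    event_id INTEGER, user_id TEXT, event_type TEXT, duration_sec INTEGER
);
INSERT INTO raw_playback_events VALUES
    (1, 'u1', 'play', 120),
    (2, 'u1', 'play', 300),
    (3, 'u2', 'play', 45),
    (4, 'u2', 'pause', NULL);
""")

# A dbt-style staging model: filter to one event type, then aggregate.
conn.executescript("""
CREATE VIEW stg_watch_time AS
SELECT user_id, SUM(duration_sec) AS total_watch_sec
FROM raw_playback_events
WHERE event_type = 'play'
GROUP BY user_id;
""")

# The equivalent of dbt's not_null / unique schema tests on the model.
nulls = conn.execute(
    "SELECT COUNT(*) FROM stg_watch_time WHERE user_id IS NULL"
).fetchone()[0]
dupes = conn.execute(
    "SELECT COUNT(*) FROM (SELECT user_id FROM stg_watch_time "
    "GROUP BY user_id HAVING COUNT(*) > 1)"
).fetchone()[0]
assert nulls == 0 and dupes == 0

print(dict(conn.execute("SELECT * FROM stg_watch_time ORDER BY user_id")))
# → {'u1': 420, 'u2': 45}
```

In a real dbt project the view would be a `stg_watch_time.sql` model and the two checks would be declared as `not_null` and `unique` tests in a schema YAML file rather than hand-written SQL.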
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You have a minimum of 5 years of experience in the MSBI product suite, particularly in Power BI and DAX. Your role involves data preparation for BI projects, understanding business requirements in a BI context, transforming raw data into meaningful insights using Power BI, and working with SSIS. You are skilled in requirement analysis, design, prototyping, and building enterprise models using Power BI Desktop. Your responsibilities also include developing data models, OLAP cubes, and reports while applying best practices to the development lifecycle. You document source-to-target mappings, data dictionaries, and database designs, and identify areas for optimization in data flows.

You have a good understanding of DAX queries in Power BI Desktop and can create Power BI dashboards, reports, and KPI scorecards, and transform manual reports. Additionally, you have experience in visualization, transformation, data analysis, and formatting. Your expertise extends to connecting to data sources, importing and transforming data for Business Intelligence, and publishing and scheduling Power BI reports. You are also involved in the installation and administration of Microsoft SQL Server. Knowledge of EBS modules like Finance, HCM, and Procurement is considered an advantage.

You excel in a fast-paced, dynamic, client-facing role, delivering high-quality work products that exceed expectations. Your leadership, interpersonal, prioritization, multi-tasking, problem-solving, and communication skills are exceptional. Your ability to thrive in a team-oriented environment, manage ambiguity, adapt to new technologies, and solve undefined problems makes you a valuable asset.
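The kind of DAX measure this role builds — for example `Total Sales = SUM(Sales[Amount])` plus a year-over-year ratio using `DIVIDE` — can be sanity-checked outside Power BI. A minimal sketch, with hypothetical sample rows standing in for the report's data model:

```python
from collections import defaultdict

# Hypothetical sales rows; in Power BI these would live in the data model.
sales = [
    {"year": 2023, "region": "West", "amount": 100.0},
    {"year": 2023, "region": "East", "amount": 150.0},
    {"year": 2024, "region": "West", "amount": 180.0},
    {"year": 2024, "region": "East", "amount": 120.0},
]

# Mirrors the DAX measure Total Sales = SUM(Sales[Amount]),
# evaluated per year (the filter context a report visual would apply).
total_by_year = defaultdict(float)
for row in sales:
    total_by_year[row["year"]] += row["amount"]

# Mirrors: YoY % = DIVIDE([Total Sales] - [PY Sales], [PY Sales])
yoy_pct = (total_by_year[2024] - total_by_year[2023]) / total_by_year[2023]
print(round(yoy_pct, 3))  # → 0.2
```

Reproducing a measure like this against an extract of the source data is a cheap way to validate a dashboard number before it ships.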
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
As an Intermediate Excel VBA and Power BI Developer at BCE Global Tech, you will be a crucial part of automating operational processes and creating insightful dashboards to aid BCE's business units. Your role will involve close collaboration with stakeholders from finance, operations, and technology to streamline reporting, enhance data visibility, and support strategic initiatives.

You will be responsible for:
- Developing and maintaining robust Excel VBA macros to automate manual workflows and reporting processes
- Designing and deploying Power BI dashboards for real-time insights into operational and financial performance
- Collaborating with cross-functional teams to gather requirements and translate them into scalable technical solutions
- Integrating data from various sources (Excel, SQL Server, SharePoint, APIs) into Power BI for unified reporting
- Optimizing existing Excel tools and Power BI reports for performance, usability, and maintainability
- Ensuring data integrity, security, and compliance with BCE's data governance standards
- Documenting solutions and providing training/support to end users as necessary

Required Skills & Qualifications:
- 3-5 years of hands-on experience in Excel VBA development
- Strong expertise in Power BI, including DAX, Power Query (M), and data modeling
- Proficiency in SQL and experience working with relational databases
- Experience integrating Excel and Power BI with enterprise data sources
- Strong analytical thinking and problem-solving skills
- Excellent communication and stakeholder management abilities
- Ability to work independently and collaboratively in a fast-paced, agile environment

Nice To Have:
- Experience with Power Automate or other Microsoft Power Platform tools
- Familiarity with Agile/Scrum methodologies
- Exposure to telecom or enterprise operations is a plus

What We Offer:
- Competitive salaries and comprehensive health benefits
- Flexible work hours and remote work options
- Professional development and training opportunities
- A supportive and inclusive work environment
- Access to cutting-edge technology and tools
Posted 1 week ago
15.0 - 24.0 years
20 - 35 Lacs
Hyderabad, Chennai
Work from Office
Senior Data SME

Experience: 15+ years, spanning all four platforms: the Business, Technology, Enablement, and Foundation platforms.

Business Perspective:
- Experience as a Principal Architect in leading organizations, handling customers effectively
- Advanced understanding of Enterprise Architecture principles
- Well-versed in end-to-end data management philosophies and governance processes fit to the business
- Deliver data solutions and integration pipelines in accordance with agreed organizational standards, ensuring services are resilient, scalable, and future-proof
- Collaborate with cross-functional teams to define and implement cloud strategies that align with business goals
- Participate in architectural discussions and decision-making to ensure alignment with business goals and best practices
- Working experience with modern data platforms, including cloud and related technologies, graph databases, and virtualization

Technology Perspective:
- Worked as an architect on multiple large, distributed, mission-critical applications
- Proven experience as an architect and engineering lead in the Data & Analytics stream
- In-depth understanding of data structure principles and data platforms
- Used architecture tools for modelling
- Keeps track of industry trends and incorporates new concepts
- Advanced understanding of AI concepts, LLMs, RAG, and agentic apps
- Advanced understanding of data visualization and data pipelines
- Excellent understanding of traditional and distributed computing paradigms
- Experience in designing and building scalable data pipelines
- Excellent knowledge of data warehouse / data lake technology and business intelligence concepts
- Good knowledge of relational, NoSQL, and big data databases, with the ability to write complex SQL queries
- Strong implementation experience across the technology areas below (breadth), with deep technical expertise in some of them:
  - Data integration – ETL tools like IBM DataStage, Talend, and Informatica; ingestion mechanisms like Flume and Kafka
  - Data modelling – dimensional and transactional modelling using RDBMS, NoSQL, and big data technologies
  - Data visualization – tools like Tableau
  - Big data – the Hadoop ecosystem, distributions like Cloudera / Hortonworks, Pig, and Hive
  - Data processing frameworks – Spark and Spark Streaming
- Hands-on experience with multiple databases such as PostgreSQL, Snowflake, Oracle, MS SQL Server, and NoSQL stores (HBase / Cassandra, MongoDB)
- Experience in the cloud data ecosystem (AWS, Azure, or GCP) in the data engineering space, including at least a few complex, high-volume data projects as an architect (mandatory)
- Contributing to developing and maintaining cloud governance frameworks, policies, and procedures
- Proficient in integrating cloud services with existing on-premises systems and applications, with an understanding of various integration techniques

Enablement & Foundation Perspective:
- Published white papers in recognized forums
- Knowledgeable in ROI and benefits analysis
- Very good presentation and articulation skills
- Awareness of security concerns when designing enterprise applications
- Creative problem-solver with strong communication skills
- Awareness of opportunities for innovation with new tools and uses of data
- Strong decision-making abilities, with the capacity to prioritize tasks and manage resources effectively to ensure successful project outcomes

Please share your updated profile with meenakshi.biradar@hcltech.com
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Analyst in Data Analytics, you will be responsible for leveraging Snowflake, Databricks, and Power BI to analyze data, create dashboards, and provide valuable insights to support data-driven decision-making. Your role will be based in Mumbai and may require occasional travel to client locations. This non-engineering position is ideal for individuals who are detail-oriented, proactive, and have a strong background in data analytics and business intelligence.

Your key responsibilities will include analyzing and interpreting data using Snowflake and Databricks, designing impactful dashboards and visualizations with Power BI, collaborating with stakeholders to understand business requirements, identifying trends and opportunities for business improvements, ensuring data accuracy and consistency, and delivering clear reports to business leaders and teams.

To excel in this role, you should have at least 3 years of hands-on experience in data analytics and business intelligence. Proficiency in Snowflake and Databricks using SQL and/or Python is essential, along with expertise in Power BI report building, DAX functions, and dashboard design. A solid understanding of data modeling, KPIs, and data storytelling, plus strong SQL skills, is also required. Excellent communication, analytical thinking, and the ability to manage multiple tasks while working cross-functionally are key skills for success in this position. If you are a proactive and detail-oriented individual with a passion for data analytics, this role offers an exciting opportunity to leverage your skills and contribute to data-driven decision-making within the organization.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
As a Data Engineer, you will be responsible for building and maintaining data pipelines using Snowflake, SQL, and Python to establish a robust data platform for our team. Your role will involve collaborating with data analysts and data scientists to ensure data accuracy and security while optimizing data systems for performance. Your key responsibilities will include constructing data pipelines and ETL processes, working with Snowflake for data storage and optimization, writing and enhancing SQL queries, leveraging Python for data tasks, and monitoring data systems to uphold their efficiency. Your expertise in Snowflake, advanced SQL skills, proficiency in Python for data engineering, experience with ETL processes, and understanding of data modeling will be essential for this role.

To qualify for this position, you must hold a Bachelor's degree in Computer Science or a related field, possess over 6 years of data engineering experience, demonstrate strong problem-solving and communication skills, and be able to collaborate effectively within a team; experience in AI/ML is ideal. Additionally, being a SnowPro certified professional would be a plus. If you are passionate about data engineering and possess the required skills and qualifications, we invite you to join our team and contribute to the development of our data platform.
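A core pattern behind the ETL responsibilities above is the idempotent incremental load: applying a batch of changed rows so that re-running the same batch leaves the target unchanged. A minimal sketch, with sqlite3's UPSERT standing in for the `MERGE INTO` statement Snowflake would use; the `dim_customer` table and its rows are hypothetical:

```python
import sqlite3

# sqlite3 stands in for Snowflake; Snowflake itself would use MERGE INTO.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
    customer_id TEXT PRIMARY KEY, email TEXT, updated_at TEXT
)""")
conn.execute(
    "INSERT INTO dim_customer VALUES ('c1', 'old@example.com', '2024-01-01')"
)

# An incremental batch: one changed row, one new row.
batch = [
    ("c1", "new@example.com", "2024-02-01"),
    ("c2", "c2@example.com", "2024-02-01"),
]

# Idempotent upsert: re-running the same batch leaves the table unchanged.
conn.executemany(
    """INSERT INTO dim_customer (customer_id, email, updated_at)
       VALUES (?, ?, ?)
       ON CONFLICT(customer_id) DO UPDATE SET
           email = excluded.email,
           updated_at = excluded.updated_at""",
    batch,
)
conn.commit()

rows = conn.execute(
    "SELECT customer_id, email FROM dim_customer ORDER BY customer_id"
).fetchall()
print(rows)  # → [('c1', 'new@example.com'), ('c2', 'c2@example.com')]
```

Keying the upsert on the natural key (`customer_id`) is what makes retries of a failed pipeline run safe.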
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
As an SAP Fieldglass Technical Configuration Lead at our organization, you will play a crucial role in designing, configuring, and implementing Fieldglass solutions, with a specific focus on the Contingent Labor and Services modules. Your responsibilities will involve leading system builds and integrations and ensuring alignment with business and compliance requirements. Your key tasks will include technical configuration, data modeling, workflow setup, and integration across SAP Fieldglass and related platforms such as SAP S/4HANA and Ariba. Serving as the technical lead, you will translate intricate requirements into scalable solutions, supporting global implementation and adoption efforts.

Your responsibilities will include leading the technical configuration and development of SAP Fieldglass, concentrating on the Contingent Workforce and Services modules. You will oversee end-to-end design, build, testing, and deployment activities; configure and maintain various components; develop integrations; and ensure compliance with enterprise standards and regulations. Additionally, you will act as a subject matter expert on Fieldglass capabilities, document solution designs and configurations, lead user acceptance testing, troubleshoot issues, and provide progress reports to stakeholders and leadership. Moreover, you will support change management initiatives through training materials and knowledge transfer.

**Basic Qualifications:**
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related technical field
- 5+ years of relevant experience, with at least 2 years managing SAP Fieldglass implementations
- Proficiency in SAP Fieldglass modules and hands-on configuration experience
- Strong collaboration, communication, and stakeholder engagement skills
- Ability to manage multiple priorities and work efficiently under tight timelines

**Preferred Qualifications:**
- Familiarity with middleware and integration tools
- Experience in global deployments and regional compliance nuances
- Knowledge of SAP Business Network invoicing processes and vendor portal functionality
- Experience in system governance, audit support, and SOX compliance
- Understanding of contractor onboarding processes and vendor management

*Non-standard Work Schedule, Travel Or Environment Requirements:* Support for implementations may require work outside normal hours and occasionally during weekends.

Pfizer is an equal opportunity employer and adheres to all applicable equal employment opportunity legislation in the jurisdictions in which it operates.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Experience in developing digital marketing/digital analytics solutions using Adobe products is required for this role. You should have experience in Adobe Experience Cloud products and recent experience with Adobe Experience Platform or a similar CDP. Having good knowledge of the Data Science workspace and building intelligent services on AEP is essential. Strong familiarity with datasets in Adobe Experience Platform, including loading data into the platform through data source connectors, APIs, and streaming ingestion connectors, is necessary. Experience in creating all required Adobe XDM (Experience Data Model) in JSON based on the approved data model for all loading data files is a key requirement. Knowledge of utilizing Adobe Experience Platform (AEP) UI & POSTMAN to automate all customer schema data lake & profile design setups within each sandbox environment is expected. Experience in configuration within Adobe Experience Platform all necessary identities & privacy settings and creating new segments within AEP to meet customer use cases is essential. Testing/validating the segments with the required destinations is also part of the role. Managing customer data using Real-Time Customer Data Platform (RTCDP) and analyzing customer data using Customer Journey Analytics (CJA) are important responsibilities. Experience with creating connections, data views, and dashboards in CJA is also required. Hands-on experience in the configuration and integration of Adobe Marketing Cloud modules like Audience Manager, Analytics, Campaign, and Target is necessary. 
Adobe Experience Cloud tool certifications (Adobe Campaign, Adobe Experience Platform, Adobe Target, Adobe Analytics) are desirable. A proven ability to communicate verbally and in writing in a high-performance, collaborative environment is expected, as is experience with data analysis, modeling, and mapping to coordinate closely with Data Architect(s).

At Capgemini, you can shape your career with a range of career paths and internal opportunities within the Capgemini group, with personalized career guidance from our leaders. Comprehensive wellness benefits are provided, including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

Capgemini is committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging. You are valued for who you are, and you can bring your original self to work. Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With a responsible and diverse group of 340,000 team members in more than 50 countries, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. The company delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.
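The XDM work described above centers on producing well-formed JSON payloads for AEP ingestion. Below is a simplified sketch of the shape an ExperienceEvent-style payload can take, built and shape-checked in Python; the tenant namespace `_mytenant` and its fields are illustrative assumptions, not a real schema, and a production payload would be validated against the approved XDM schema in AEP rather than this minimal check:

```python
import json
from datetime import datetime, timezone

# Simplified sketch of an XDM ExperienceEvent payload; the tenant
# namespace "_mytenant" and its fields are illustrative assumptions.
event = {
    "_id": "evt-0001",
    "eventType": "web.webpagedetails.pageViews",
    "timestamp": datetime(2024, 2, 1, tzinfo=timezone.utc).isoformat(),
    "identityMap": {
        "ECID": [{"id": "12345678901234567890", "primary": True}]
    },
    "_mytenant": {"campaignCode": "SPRING24"},
}

# Minimal shape check before handing the payload to streaming ingestion.
required = ("_id", "eventType", "timestamp", "identityMap")
missing = [f for f in required if f not in event]
assert not missing, f"missing XDM fields: {missing}"

payload = json.dumps(event, indent=2)
print(payload)
```

Building payloads programmatically like this also makes it easy to drive POSTMAN-style checks of each sandbox's schema and dataset setup.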
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
This is a full-time Data Engineer position with D Square Consulting Services Pvt Ltd, offered Pan-India with a hybrid work model. You should have at least 5 years of experience and be able to join immediately. As a Data Engineer, you will be responsible for designing, building, and scaling data pipelines and backend services supporting analytics and business intelligence platforms. A strong technical foundation, Python expertise, API development experience, and familiarity with containerized, CI/CD-driven workflows are essential for this role.

Your key responsibilities will include designing, implementing, and optimizing data pipelines and ETL workflows using Python tools, building RESTful and/or GraphQL APIs, collaborating with cross-functional teams, containerizing data services with Docker, managing deployments with Kubernetes, developing CI/CD pipelines using GitHub Actions, ensuring code quality, and optimizing data access and transformation.

The required skills and qualifications for this role include a Bachelor's or Master's degree in Computer Science or a related field, 5+ years of hands-on experience in data engineering or backend development, expert-level Python skills, experience building APIs with frameworks like FastAPI, Graphene, or Strawberry, proficiency in Docker, Kubernetes, SQL, and data modeling, good communication skills, familiarity with data orchestration tools, experience with streaming data platforms like Kafka or Spark, knowledge of data governance, security, and observability best practices, and exposure to cloud platforms like AWS, GCP, or Azure. If you are proactive, self-driven, and possess the required technical skills, this Data Engineer position is an exciting opportunity for you to contribute to the development of cutting-edge data solutions at D Square Consulting Services Pvt Ltd.
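The API-building side of this role typically rests on typed request/response models. Here is a minimal sketch using stdlib dataclasses as a stand-in for the pydantic models a FastAPI route would declare; the `PipelineRun` resource and the `get_run` handler are hypothetical examples, not an actual D Square API:

```python
import json
from dataclasses import dataclass, asdict

# dataclasses stand in for the pydantic models a FastAPI route would use;
# the PipelineRun resource is a hypothetical example.
@dataclass
class PipelineRun:
    run_id: str
    status: str
    rows_loaded: int

    def __post_init__(self):
        # Validation that pydantic would perform declaratively.
        if self.status not in {"queued", "running", "succeeded", "failed"}:
            raise ValueError(f"invalid status: {self.status}")
        if self.rows_loaded < 0:
            raise ValueError("rows_loaded must be non-negative")

def get_run(run_id: str) -> str:
    """What a GET /runs/{run_id} handler would return as its JSON body."""
    run = PipelineRun(run_id=run_id, status="succeeded", rows_loaded=1024)
    return json.dumps(asdict(run))

body = get_run("run-42")
print(body)  # → {"run_id": "run-42", "status": "succeeded", "rows_loaded": 1024}
```

Centralizing validation in the model keeps every endpoint that returns a `PipelineRun` consistent, which is the same design motive behind FastAPI's response-model feature.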
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
You are a skilled Snowflake + Python + SQL Developer with 4-6 years of experience, ready to join a dynamic team. Your expertise lies in cloud data platforms, Python programming, and SQL database management. While experience with DBT (Data Build Tool) is a plus, it's not mandatory for this role.

In this role, your primary responsibilities include designing, implementing, and managing data pipelines using Snowflake. You will also be developing and optimizing complex SQL queries for data extraction, transformation, and reporting, as well as handling large-scale data processing and integration using Python. Data modeling and optimization are crucial aspects of your role: you will develop and maintain Snowflake data models and warehouse architecture, optimizing data pipelines for performance and scalability. Collaboration is key as you work closely with cross-functional teams to understand data needs and provide efficient solutions.

ETL development is another essential part of your role. You will develop and maintain ETL/ELT processes to support data analytics and reporting, utilizing Python scripts and Snowflake tools for data transformation and integration. Monitoring performance, troubleshooting issues, and ensuring data integrity are also part of your responsibilities. While leveraging DBT for data transformation within Snowflake is optional, it is considered advantageous; you may also develop and maintain DBT models to enhance the quality of data transformations.

Your key skills and qualifications include hands-on experience with Snowflake, Python, and SQL, a strong understanding of SQL databases and data modeling concepts, and experience in building scalable data pipelines and ETL/ELT processes using Python and Snowflake. Knowledge of data warehousing best practices, familiarity with cloud platforms such as AWS, Azure, or GCP, and an understanding of version control systems like Git are also beneficial for this role.
Posted 1 week ago