3.0 - 7.0 years
0 Lacs
Vadodara, Gujarat
On-site
The role supports the integration, transformation, and delivery of data using tools within the Microsoft Fabric platform. You will collaborate with the data engineering team to deliver Data and Insights solutions, ensuring high-quality data that enables analytics capabilities across the organization.

Your key responsibilities will include assisting in the development and maintenance of ETL pipelines using Azure Data Factory and other Fabric tools. You will work closely with senior engineers and analysts to gather requirements, develop prototypes, and support data integration from various sources. You will also help develop and maintain Data Warehouse schemas, contribute to documentation, and participate in testing efforts to uphold data reliability. Learning and adhering to the data standards and governance practices set by the team is essential. A minimal ETL sketch illustrating this pattern follows below.

Essential skills and experience include a solid understanding of data engineering concepts and data structures, familiarity with Microsoft data tools such as Azure Data Factory, OneLake, or Synapse, and knowledge of ETL processes and data pipelines. The ability to work collaboratively in an Agile/Kanban team environment is essential. A Microsoft Certified Fabric DP-600 or DP-700 certification, or any other relevant Azure Data certification, is advantageous.

Desirable skills and experience include familiarity with Medallion Architecture principles, exposure to MS Purview or other data governance tools, an understanding of data warehousing and reporting concepts, and an interest or background in retail data domains.
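As a rough illustration of the extract-transform-load pattern this role centers on, here is a minimal, self-contained Python sketch. It uses pandas and a local CSV rather than Fabric services; the file names, column names, and cleaning rules are illustrative assumptions, not details from the posting.

```python
# Minimal ETL sketch: extract from CSV, transform, load to Parquet.
# All file and column names are hypothetical placeholders.
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Extract: read raw source data.
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: basic cleansing and typing, as an ETL step might do.
    df = df.drop_duplicates()
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df = df.dropna(subset=["order_id", "order_date"])  # reject incomplete rows
    df["amount"] = df["amount"].astype(float)
    return df

def load(df: pd.DataFrame, path: str) -> None:
    # Load: write the curated output for downstream consumers.
    df.to_parquet(path, index=False)

if __name__ == "__main__":
    load(transform(extract("raw_orders.csv")), "curated_orders.parquet")
```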
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
Jalandhar, Punjab
On-site
Secuneus Technologies is an independent registered company based in Jalandhar, India, specializing in core cyber security to help businesses protect themselves against the latest cyber threats. We offer end-to-end security consultancy solutions and assist organizations with cyber security compliance, including ISO 27001. Our team of qualified cyber security specialists focuses on delivering results at a fair price, without pushy sales staff or non-technical account managers.

This is a full-time on-site role for a Data Engineer at Secuneus Tech in Jalandhar. As a Data Engineer, you will be responsible for data engineering, data modeling, ETL (Extract, Transform, Load), data warehousing, and data analytics.

The ideal candidate should possess data engineering and data modeling skills, ETL expertise, proficiency in data warehousing and data analytics, experience working with large datasets, strong problem-solving and analytical skills, proficiency in SQL and database management systems, knowledge of programming languages such as Python or Java, and a Bachelor's degree in Computer Science, Data Engineering, or a related field.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
NTT DATA is looking for a Senior Adobe Experience Platform (AEP) Developer to join their team in Chennai, Tamil Nadu, India. As a Senior AEP Developer, you will be responsible for supporting, maintaining, designing, developing, and implementing customer data solutions within the AEP ecosystem. Your role will involve building and managing data pipelines, integrating data from various sources, and creating custom solutions to address specific business needs. A hedged ingestion sketch follows below.

Key Responsibilities:
- Utilize deep knowledge of the AEP suite, Experience Data Model (XDM), Data Science Workspace, and other relevant modules.
- Work proficiently with AEP APIs and Web SDKs, and integrate with other MarTech platforms such as Adobe Target, CJA, AJO, and Adobe Campaign.
- Manage data ingestion, transformation, and activation within the AEP environment.
- Implement best practices in data modeling and ensure data accuracy, consistency, and security.
- Design, develop, and maintain high-quality data pipelines using AEP and other technologies.
- Develop custom solutions using scripting languages (e.g., JavaScript, Python) and troubleshoot data quality issues.
- Collaborate with cross-functional teams to support data-driven solutions, improve customer experience, and drive business growth.
- Provide insights by analyzing data trends and communicate technical concepts effectively to different audiences.
- Stay updated on advancements in AEP and related technologies.

Qualifications:
- Education: Bachelor's degree in computer science, engineering, or a related field (or equivalent experience).
- Experience: Minimum 5 years of overall IT experience, with 3-4 years of hands-on experience in Adobe Experience Platform (AEP).
- Technical Skills: Proficiency in JavaScript, RESTful APIs, JSON, XML, data warehousing, data modeling, and tag management systems such as Adobe Launch.
- Soft Skills: Strong analytical, problem-solving, communication, interpersonal, collaboration, organizational, and time-management skills.

Bonus Points:
- Experience with Adobe Marketing Cloud solutions, Agile development methodologies, data visualization tools, and data governance/compliance.
- Understanding of Real-time Customer Data Platform (RT-CDP) is a plus.

About NTT DATA: NTT DATA is a trusted global innovator of business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team across 50+ countries, NTT DATA offers services in consulting, data and AI, industry solutions, applications, infrastructure, and connectivity. As a leader in digital and AI infrastructure and part of NTT Group, NTT DATA invests significantly in research and development to support organizations in their digital transformation journey. Visit us at us.nttdata.com.
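As a loose illustration of programmatic ingestion into AEP, here is a Python sketch that posts an XDM-shaped event to a streaming ingestion endpoint. The endpoint URL, connection ID, schema reference, and field names are hypothetical placeholders, and real calls would need an authenticated token; treat this as a shape sketch under those assumptions, not a definitive AEP contract.

```python
# Hypothetical sketch of streaming an XDM event into AEP.
# The URL, IDs, and payload fields below are placeholders, not real values.
import json
from datetime import datetime, timezone

import requests

INLET_URL = "https://dcs.adobedc.net/collection/<CONNECTION_ID>"  # placeholder inlet

def build_xdm_event(email: str, page: str) -> dict:
    # XDM-shaped ExperienceEvent body; schemaRef and fields are illustrative.
    return {
        "header": {
            "schemaRef": {
                "id": "https://ns.adobe.com/<TENANT>/schemas/<SCHEMA_ID>",
                "contentType": "application/vnd.adobe.xed-full+json;version=1",
            }
        },
        "body": {
            "xdmEntity": {
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "eventType": "web.webpagedetails.pageViews",
                "identityMap": {"Email": [{"id": email, "primary": True}]},
                "web": {"webPageDetails": {"name": page}},
            }
        },
    }

def send_event(event: dict) -> int:
    resp = requests.post(
        INLET_URL,
        headers={"Content-Type": "application/json"},
        data=json.dumps(event),
        timeout=10,
    )
    return resp.status_code

if __name__ == "__main__":
    print(send_event(build_xdm_event("user@example.com", "home")))
```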
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Faridabad, Haryana
On-site
We are seeking a skilled QA / Data Engineer with 3-5 years of experience. As the ideal candidate, you will possess expertise in manual testing and SQL, along with knowledge of automation and performance testing. Your primary responsibility will be to ensure the quality and reliability of our data-driven applications through comprehensive testing and validation.

Key Responsibilities:
- Apply extensive experience in manual testing, particularly in data-centric environments.
- Use strong SQL skills for data validation, querying, and testing database functionality (see the sketch below).
- Apply data engineering concepts, including ETL processes, data pipelines, and data warehousing.
- Work with geo-spatial data to enhance data quality and analysis.
- Apply QA methodologies and best practices for software and data testing.
- Communicate effectively for seamless collaboration within the team.

Desired Skills:
- Experience with automation testing tools and frameworks (e.g., Selenium, JUnit) for data pipelines.
- Proficiency in performance testing tools (e.g., JMeter, LoadRunner) to evaluate data systems.
- Familiarity with data engineering tools and platforms (e.g., Apache Kafka, Apache Spark, Hadoop).
- Understanding of cloud-based data solutions (e.g., AWS, Azure, Google Cloud) and their testing methodologies.

Qualifications:
- Bachelor of Engineering / Bachelor of Technology (B.E./B.Tech.)

In this role, you will play a crucial part in ensuring the quality of our data-centric applications through thorough testing and validation. Your expertise in manual testing, SQL, ETL processes, data pipelines, and data warehousing, along with your automation and performance testing skills, will be key to your success. Join our team in Bengaluru/Gurugram and contribute to the reliability and efficiency of our data-driven solutions.
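To make the data-validation idea concrete, here is a minimal sketch of SQL-driven quality checks written as pytest tests against an in-memory SQLite database. The table, columns, and expectations are invented for illustration; a real suite would point at the actual warehouse.

```python
# Minimal data-validation sketch with pytest + SQLite (stdlib only).
# Table name, columns, and expectations are hypothetical.
import sqlite3

import pytest

@pytest.fixture()
def conn():
    c = sqlite3.connect(":memory:")
    c.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
    c.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(1, 10.5, "north"), (2, 99.0, "south"), (3, 5.25, "north")],
    )
    yield c
    c.close()

def test_no_null_keys(conn):
    # Primary keys must never be NULL.
    (nulls,) = conn.execute("SELECT COUNT(*) FROM orders WHERE order_id IS NULL").fetchone()
    assert nulls == 0

def test_amounts_positive(conn):
    # Business rule: order amounts must be strictly positive.
    (bad,) = conn.execute("SELECT COUNT(*) FROM orders WHERE amount <= 0").fetchone()
    assert bad == 0

def test_row_count_matches_source(conn):
    # Reconciliation: target row count should match the (stubbed) source count.
    expected_source_rows = 3
    (actual,) = conn.execute("SELECT COUNT(*) FROM orders").fetchone()
    assert actual == expected_source_rows
```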
Posted 2 days ago
5.0 - 10.0 years
0 Lacs
Haryana
On-site
Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 1000 companies. With a presence in more than 45 countries across five continents, Evalueserve excels in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate clients' business impact and strategic decision-making. The team consists of 4,500+ talented professionals operating across 45 countries, spanning India, China, Chile, Romania, the US, and Canada, as well as emerging markets such as Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in multiple countries, Evalueserve offers a dynamic, growth-oriented, and open culture that prioritizes flexible work-life balance, diverse and inclusive teams, and equal opportunities for all.

Evalueserve University was established to design and deliver a Global Learning Strategy integral to business success. From new-graduate onboarding to leadership skill development workshops and domain-specific knowledge programs, the university strengthens structured learning opportunities.

As a member of the Evalueserve team, you will partner with business leaders and subject matter experts to design and deliver learning programs focused on developing key competencies in Data Science. You will manage, enhance, and deliver signature boot-camp training programs for new hires globally and collaborate with Operations to identify training needs for on-role analysts. Additionally, you will create new learning content, monitor training quality and effectiveness, and ensure alignment with business needs.

To excel in this role, you should have 5-10 years of experience in Data Science, along with proficiency in cloud fundamentals (Azure/AWS/GCP), Python, Machine Learning, LLMOps, MLOps, project management, and NLP. Advanced skills in Computer Vision, Deep Learning, and other frameworks are desirable, and experience in Data Warehousing, Data Engineering, or Big Data is an added advantage. A graduation degree is a prerequisite, along with excellent communication and interpersonal skills and proficiency in Excel and PowerPoint.

If you are ready to elevate your impact through innovation and learning, Evalueserve welcomes you to join its dynamic team and contribute to its mission of delivering cutting-edge solutions to clients worldwide.
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
As a Senior Data Engineer at FIS, you will have the opportunity to work on challenging issues in financial services and technology. The retirement market is highly competitive, requiring firms to deliver superior customer value and solve complex problems efficiently. You will play a crucial role in designing, developing, testing, and maintaining architectures such as databases and large-scale processing systems.

Your responsibilities will include identifying and resolving database performance issues, overseeing ETL pipelines, developing testable code in SQL and Python, and communicating complex processes to clients. You will provide technical leadership, project management, and mentorship to junior engineers. Additionally, you will develop data processes for modeling, mining, and production, recommend ways to improve data reliability and quality, and design scalable data pipelines using AWS services (a minimal sketch follows below). You will collaborate with data scientists and stakeholders to understand data requirements, implement data security measures, and maintain data integrity. Monitoring and troubleshooting data pipelines for optimal performance, optimizing data warehouse architectures, and creating comprehensive documentation for data engineering processes will also be part of your role.

With over 10 years of relevant experience, you will design, code, and test major features, ensuring compliance with coding best practices and predefined processes. Your skills in AWS cloud services, programming languages, SQL database design, data pipeline orchestration, big data tools, and ETL will be essential. Strong analytical, communication, and project management skills are required to lead and manage effectively under pressure. A degree or equivalent qualification is expected, along with fluency in English and the ability to discuss technical solutions with internal and external parties. You should be detail-oriented, organized, and able to work both autonomously and as part of a global team.

FIS offers a competitive salary, benefits, and numerous career development opportunities for you to grow and excel in your role. Join FIS, the world's largest global provider dedicated to financial technology solutions, and be part of a team that powers billions of transactions annually across the globe. With a history spanning over 50 years, FIS serves clients in over 130 countries, providing innovative solutions for the financial services industry. If you are a self-starter with a team mindset looking for a multifaceted job with a high degree of responsibility and opportunities for personal development, FIS could be the right next step in your career.
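As a rough sketch of the kind of AWS-based pipeline step described above, the following Python code reads a CSV object from S3 with boto3, filters it, and writes the result back. The bucket and key names are hypothetical, credentials are assumed to come from the environment, and the business rule is invented; a production pipeline would add retries, schema checks, and orchestration.

```python
# Hedged sketch of a single S3-to-S3 transform step using boto3.
# Bucket and key names are placeholders; credentials come from the environment.
import csv
import io

import boto3

s3 = boto3.client("s3")

def transform_step(src_bucket: str, src_key: str, dst_bucket: str, dst_key: str) -> int:
    # Extract: stream the raw CSV object out of S3.
    raw = s3.get_object(Bucket=src_bucket, Key=src_key)["Body"].read().decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(raw)))

    # Transform: keep only settled transactions (illustrative business rule).
    kept = [r for r in rows if r.get("status") == "settled"]

    # Load: write the filtered CSV back to the curated bucket.
    out = io.StringIO()
    if kept:
        writer = csv.DictWriter(out, fieldnames=kept[0].keys())
        writer.writeheader()
        writer.writerows(kept)
    s3.put_object(Bucket=dst_bucket, Key=dst_key, Body=out.getvalue().encode("utf-8"))
    return len(kept)

if __name__ == "__main__":
    n = transform_step("raw-bucket", "trades/2024-01-01.csv", "curated-bucket", "trades/settled.csv")
    print(f"wrote {n} settled rows")
```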
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
Haryana
On-site
As a Data Engineer at GreyOrange, you will be responsible for designing, developing, and maintaining ETL pipelines that ensure efficient data flow for high-scale data processes. Your primary focus will be on managing and optimizing data storage and retrieval in Google BigQuery, balancing performance efficiency with cost-effectiveness. Additionally, you will set up quick analytics dashboards using tools like Metabase, Looker, or another preferred platform.

Collaboration with internal analysts and stakeholders, including customers, is key to understanding data needs and implementing robust solutions. You will play a crucial role in monitoring, troubleshooting, and resolving data pipeline issues to maintain data integrity and availability. Implementing data quality checks and maintaining data governance standards across the ETL processes will also be part of your responsibilities. Automation will be a significant aspect of your role: you will develop scripts for repetitive tasks and optimize manual processes. Documenting the analytics implementation and data structures end to end will provide a guide for all users. Staying updated with industry best practices and emerging technologies in data engineering and cloud-based data management is essential. A cost-aware BigQuery sketch follows below.

In addition to these responsibilities, the role requires:
- 4+ years of experience as a Data Engineer or in a similar role
- Strong experience with ETL tools and frameworks such as Apache Airflow, Dataflow, or Estuary
- Proficiency in SQL and extensive experience with Google BigQuery
- Experience setting up analytics dashboards using tools like Looker or Metabase
- Knowledge of data warehousing concepts and best practices
- Experience with cloud platforms, particularly Google Cloud Platform (GCP)
- Strong analytical and problem-solving skills with a focus on cost optimization in cloud environments
- Familiarity with Python or other scripting languages for automation and data processing
- Excellent communication skills and the ability to work collaboratively in a team environment
- Experience with data modeling and schema design

You will also have the opportunity to provide guidance and mentorship to junior data engineers and data analysts when needed.
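As a small illustration of cost-aware querying in BigQuery, here is a Python sketch using the official google-cloud-bigquery client. The project, dataset, and table names are placeholders, and the byte cap is an arbitrary example value chosen to show the guardrail pattern.

```python
# Cost-guarded BigQuery query sketch; project/dataset/table are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # picks up project/credentials from the environment

sql = """
    SELECT region, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM `my-project.analytics.orders`
    GROUP BY region
    ORDER BY revenue DESC
"""

# Guardrail: refuse to run if the scan would exceed ~1 GB (example cap).
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10**9)

job = client.query(sql, job_config=job_config)
for row in job.result():
    print(row["region"], row["orders"], row["revenue"])

print(f"bytes processed: {job.total_bytes_processed}")
```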
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
The role seeks a dynamic individual to join the M&R Sales Tech team, bringing expertise in software development of ETL and ELT jobs for the data warehouse software development team. This position plays a crucial role in defining the design and architecture during the migration from legacy SSIS technology to cutting-edge cloud technologies such as Azure, Databricks, and Snowflake. The ideal candidate will possess a robust background in software architecture, data engineering, and cloud technologies. A hedged Snowflake loading sketch follows after the role description.

Key Responsibilities:
- Architectural Design: Design and implement ETL data architectures, including creating algorithms, developing data models and schemas, and setting up data pipelines.
- Technical Leadership: Provide technical leadership to the software development team to ensure data solutions align with business objectives and the overall IT strategy.
- Data Strategy and Management: Define data strategy and oversee data management within the organization, focusing on data governance, quality, privacy, and security using Databricks and Snowflake technologies.
- Implementation of Machine Learning Models: Utilize Databricks for implementing machine learning models, conducting data analysis, and deriving insights.
- Data Migration and Integration: Transfer data from on-premise or other cloud platforms to Snowflake, integrating Snowflake and Databricks with other systems for seamless data flow.
- Performance Tuning: Optimize database performance by fine-tuning queries, enhancing processing speed, and improving data storage and retrieval mechanisms.
- Troubleshooting and Problem Solving: Identify and resolve issues related to databases, data migration, data pipelines, and other ETL processes, addressing concerns such as data quality, system performance, and data security.
- Stakeholder Communication: Communicate effectively with stakeholders to understand requirements and deliver solutions that meet business needs.

Required Qualifications:
- Education: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
- Experience: Minimum of 8 years of experience in software development and architecture roles.
- Technical Skills: Proficiency in ETL/ELT processes and tools, particularly SSIS; 5+ years of experience with large data warehousing applications; solid experience with reporting tools such as Power BI and Tableau; familiarity with creating batch and real-time jobs with Databricks and Snowflake and with streaming and orchestration platforms such as Kafka and Airflow.
- Soft Skills: Strong leadership and team management skills, problem-solving abilities, and effective communication and interpersonal skills.

Preferred Qualifications:
- Experience with Agile development methodologies.
- Certification in relevant cloud technologies (e.g., Azure, Databricks, Snowflake).

Primary Skills: Azure, Snowflake, Databricks
Secondary Skills: SSIS, Power BI, Tableau

Role Purpose: The purpose of the role is to create exceptional architectural solution designs and provide thought leadership, enabling delivery teams to achieve exceptional client engagement and satisfaction.

Key Roles and Responsibilities:
- Develop architectural solutions for new deals and major change requests, ensuring scalability, reliability, and manageability of systems.
- Provide solutioning for client RFPs, ensuring overall design assurance.
- Manage the portfolio of to-be solutions to align with business outcomes, analyzing the technology environment, client requirements, and enterprise specifics.
- Offer technical leadership in designing, developing, and implementing custom solutions using modern technology.
- Define current- and target-state solutions, articulate architectural targets and recommendations, and propose investment roadmaps.
- Evaluate and recommend solutions for integration with the technology ecosystem.
- Collaborate with IT groups to ensure task transition, performance, and issue resolution.
- Enable delivery teams by providing optimal delivery solutions, building relationships with stakeholders, and developing relevant metrics to drive results.
- Manage multiple projects, identify risks, ensure quality assurance, and recommend tools for reuse and automation.
- Support pre-sales teams in presenting solution designs to clients, negotiate requirements, and demonstrate thought leadership.
- Competency Building and Branding: Develop PoCs, case studies, and white papers, attain market recognition, and mentor team members for career development.
- Team Management: Resourcing, talent management, performance management, and employee satisfaction and engagement.

Join us at Wipro, a business driven by purpose and reinvention, where your ambitions can be realized through constant evolution and empowerment. Applications from individuals with disabilities are encouraged.
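As a minimal sketch of the Snowflake side of such a migration, here is Python code using the snowflake-connector-python package to load staged files with COPY INTO and run an in-warehouse transform. The account, warehouse, stage, and table names are placeholders, and credentials would normally come from a secrets manager rather than environment literals.

```python
# Hedged ELT sketch with snowflake-connector-python; all identifiers are placeholders.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="SALES",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Load: ingest staged CSV files into a raw landing table.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @ORDERS_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    # Transform: materialize a curated summary (ELT-style, in-warehouse SQL).
    cur.execute("""
        CREATE OR REPLACE TABLE CURATED.DAILY_REVENUE AS
        SELECT ORDER_DATE, SUM(AMOUNT) AS REVENUE
        FROM RAW.ORDERS
        GROUP BY ORDER_DATE
    """)
finally:
    conn.close()
```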
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
You are a highly skilled and experienced Senior Engineer in Data Science who will be responsible for designing and implementing next-generation data science solutions. Your role will involve shaping the data strategy and driving innovation through advanced analytics and machine learning.

Your responsibilities will include providing technical leadership and designing end-to-end data science solutions, encompassing data acquisition, ingestion, processing, storage, modeling, and deployment. You will develop and maintain scalable data pipelines and architectures using cloud-based platforms and big data technologies to handle large volumes of data efficiently, and collaborate with stakeholders to define business requirements and translate them into technical specifications. You will select and implement appropriate machine learning algorithms and techniques, staying updated on the latest advancements in AI/ML to solve complex business problems (a minimal modeling sketch follows below). Building and deploying machine learning models, monitoring and evaluating model performance, and providing technical leadership and mentorship to junior data scientists are also key aspects of this role, as is contributing to the development of data science best practices and standards.

To qualify, you should hold a B.Tech, M.Tech, M.Sc (Mathematics/Statistics), or PhD from India or abroad. You are expected to have at least 4 years of experience in data science and machine learning within roughly 7+ years of overall experience. A proven track record of technical leadership and implementing complex data science solutions is required, along with a strong understanding of data warehousing, data modeling, and ETL processes. Expertise in machine learning algorithms and techniques, time series analysis, and Python programming; knowledge of general data science tools; domain knowledge in industrial, manufacturing, and/or healthcare settings; proficiency in cloud-based platforms and big data technologies; and excellent communication and collaboration skills are all essential qualifications. Contributions to open-source projects or publications in relevant fields will be considered an added advantage.
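To ground the modeling side, here is a minimal, self-contained scikit-learn sketch: a preprocessing-plus-classifier pipeline trained and evaluated on synthetic data. The dataset, model choice, and hyperparameters are illustrative assumptions rather than anything specified in the posting.

```python
# Minimal end-to-end modeling sketch with scikit-learn on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a real feature table.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# Pipeline keeps preprocessing and the model deployable as one artifact.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=42)),
])
model.fit(X_train, y_train)

# Evaluate: in production this would feed model-monitoring dashboards.
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```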
Posted 2 days ago
1.0 - 5.0 years
0 Lacs
Karnataka
On-site
As a Data Engineer at our company, you will play a crucial role in designing and building pipelines to ingest data into BigQuery for consumption. You will be responsible for monitoring and troubleshooting data pipelines and infrastructure to ensure high availability and performance.

To excel in this role, you must have strong knowledge of data warehousing, including ETL pipeline design, development, and maintenance. You should have a minimum of 1 year of technology experience on data engineering projects and a minimum of 1 year of experience with GCP. Proficiency in Python programming is essential, along with at least 1 year of experience in SQL/PL-SQL scripting and at least 1 year of experience in data warehouse/ETL work.

If you thrive in a dynamic environment and enjoy collaborating with passionate individuals who are dedicated to their work, this opportunity is perfect for you. Join us in our mission to bring passion and customer focus to our business. If this role does not align with your current career aspirations, you can register interest in future opportunities and sign up for email alerts for new postings that match your interests on our careers site.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Database Developer and Designer, you will be responsible for building and maintaining Customer Data Platform (CDP) databases to ensure performance and stability. Your role will involve optimizing SQL queries to improve performance, creating visual data models, and administering database security. Troubleshooting and debugging SQL code issues will be a crucial part of your responsibilities.

You will be involved in data integration tasks, importing and exporting events, user profiles, and audience changes to Google BigQuery, and utilizing BigQuery for querying, reporting, and data visualization. Managing user and service account authorizations, as well as integrating Lytics with BigQuery and other data platforms, will also be part of your duties. Handling data export and import between Lytics and BigQuery, configuring authorizations for data access, and utilizing data from various source systems to integrate with CDP data models are key aspects of the role.

Preferred candidates will have experience with the Lytics CDP and a CDP certification. Hands-on experience with at least one Customer Data Platform technology and a solid understanding of the digital marketing ecosystem are required.

Your skills should include proficiency in SQL and database management, strong analytical and problem-solving abilities, experience with data modeling and database design, and the capability to optimize and troubleshoot SQL queries. Expertise in Google BigQuery and data warehousing, knowledge of data integration and ETL processes, familiarity with Google Cloud Platform services, and a strong grasp of data security and access management are essential. You should also be proficient in Lytics and its integration capabilities, have experience with data import/export processes, know authorization methods and security practices, bring strong communication and project management skills, and be able to learn new CDP technologies and deliver in a fast-paced environment.

Ultimately, your role is crucial for efficient data management and for enabling informed decision-making through optimized database design and integration.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a Project Lead (Data) at our esteemed organization, you will play a crucial role in translating business requirements into technical specifications and leading the design, development, and deployment of Business Intelligence (BI) solutions. Your responsibilities will include maintaining and supporting data analytics platforms, collaborating with cross-functional teams, executing database queries and analyses, creating visualizations, and updating technical documentation.

To excel in this role, you should possess a minimum of 5 years of experience in designing and implementing reports, dashboards, ETL processes, and data warehouses, along with at least 3 years of direct management experience. A strong understanding of data warehousing and database concepts is essential, together with expertise in BI fundamentals. Proficiency in tools such as Microsoft SQL Server, SSIS, SSRS, Azure Data Factory, Azure Synapse, and Power BI will be highly advantageous.

Your role will involve defining software development standards, communicating concepts and guidelines effectively to the team, providing technical guidance and coaching, and overseeing the progress of report and dashboard development to ensure alignment with data warehouse and RDBMS design principles. Engaging with stakeholders to identify key performance indicators (KPIs) and presenting actionable insights through reports and dashboards will be a key part of your responsibilities.

The ideal candidate will exhibit proven analytical and problem-solving abilities, possess excellent interpersonal and written communication skills, and be adept at working in a collaborative environment. If you are passionate about leveraging data to drive business decisions and possess the requisite skills and experience, we invite you to join our dynamic team and contribute to our continued success as we serve our global clientele with end-to-end IT and ICT solutions.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
The role involves designing, developing, and maintaining Qlik Sense and Power BI applications to support the data analysis and visualization needs of the organization.

You will be responsible for designing and creating Qlik Sense and Power BI dashboards and reports based on predefined requirements, identifying trends and insights, and providing actionable recommendations to business teams. Regularly updating and maintaining existing dashboards to ensure data accuracy will be part of your responsibilities, as will implementing security and access controls for dashboards and reports. You will translate data into clear and effective visualizations for business users, using Qlik Sense and Power BI to create interactive reports that highlight key metrics and trends.

Data gathering and analysis will involve extracting, cleaning, and transforming large datasets from multiple sources, including databases, APIs, and cloud platforms, and ensuring data integrity through data quality checks and validation. You will create DAX queries, calculated columns, and measures in Power BI for advanced analytics, and develop Qlik Sense scripts for data transformations and load processes.

Collaboration with business users to understand reporting needs and deliver solutions that meet those requirements is essential, as is collaborating with team members to optimize the use of Qlik Sense and Power BI features and functionality. You will also assist users with basic troubleshooting of Qlik Sense and Power BI reports and dashboards and provide support for navigating and interpreting data visualizations.

Location: Pune, India

Essential Skills:
- Strong understanding of data visualization best practices and UI/UX principles.
- Experience working with relational databases (SQL Server, MySQL, PostgreSQL, etc.).
- Knowledge of ETL processes, data warehousing, and data modeling concepts.
- Experience with Power Automate, Power Apps, and Qlik NPrinting (nice to have).
- Experience in Python or R for data analysis.

Education Requirements & Experience:
- 3-6 years of experience in Power BI and Qlik Sense development.
- MS or bachelor's degree in engineering, computer science, or a related field.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
Jaipur, Rajasthan
On-site
Apply Digital is a global digital transformation partner for change agents. Leveraging expertise that spans Business Transformation Strategy, Product Design & Development, Commerce, Platform Engineering, Data Intelligence, Marketing Services, Change Management, and beyond, we enable our clients to modernize their organizations and deliver meaningful impact to their business and customers. Our 750+ team members have helped transform global companies like Kraft Heinz, NFL, Moderna, Lululemon, Dropbox, Atlassian, A+E Networks, and The Very Group. Apply Digital was founded in 2016 in Vancouver, Canada, and has grown to nine cities across North America, South America, the UK, and Europe.

At Apply Digital, we believe in a One Team approach, operating within a pod structure. Each pod combines senior leadership, subject matter experts, and cross-functional skill sets, all working within a common tech and delivery framework. The structure is supported by well-organized scrum and sprint cadences, ensuring teams release often and hold retrospectives to progress toward desired outcomes. Wherever Apply Digital operates globally, it envisions a safe, empowered, respectful, and fun community for its people every single day. We work to embody our SHAPE values (smart, humble, active, positive, and excellent) to create a space for the team to connect, grow, and support each other to make a difference.

Apply Digital is a hybrid-friendly organization with remote options available if needed. The preferred candidate should be based in, or within commutable distance of, the Delhi/NCR region of India, working hours that overlap with the Eastern Standard Time (EST) zone.

Initially, the Analytics Implementation Consultant will support Kraft Heinz, a global leader in consumer packaged foods. Apply Digital aims to drive Kraft Heinz's digital transformation through implementable strategies, cutting-edge technology, and data-driven innovation to enhance consumer engagement and maximize business value. The role involves designing, implementing, and maintaining digital analytics solutions in collaboration with developers, data engineers, and product teams to ensure scalable and reliable data collection. Expertise in digital analytics platforms, tag management systems, JavaScript, SQL, and data layers is required, along with strong English language proficiency and experience working with remote teams. A hedged event-collection sketch follows below.

Responsibilities include supporting the development and implementation of robust analytics systems, collaborating with analysts and stakeholders to translate business problems into analytics solutions, QA testing data capture and reports, supporting the creation of presentations and recommendations, staying updated on technical trends and best practices, and contributing to the direction of the Data & Analytics discipline.

The ideal candidate will have strong proficiency in English, experience working with remote teams, 5+ years of analytics implementation experience, expertise with analytics platforms and tag management systems, front-end web development skills, experience with customer data platforms and mobile app analytics, an understanding of statistical analysis and machine learning concepts, familiarity with data modeling and architecture principles, and the ability to manage multiple projects concurrently. A bachelor's degree in Computer Science, Data Science, Analytics, or Engineering is required; experience with optimization tools is a plus.

Apply Digital offers a hybrid-friendly work environment and comprehensive benefits, including healthcare coverage and contributions to a Provident Fund, a gratuity bonus after five years of service, flexible PTO, engaging projects with international brands, an inclusive and safe workplace, generous training budgets, and a commitment to diversity, equity, and inclusion. Apply Digital values equal opportunity and nurtures an inclusive workplace where individual differences are recognized and celebrated. For more information, visit Apply Digital's Diversity, Equity, and Inclusion (DEI) page. Special needs or accommodations during the recruitment process can be requested by emailing india.careers@applydigital.com.
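To make the event-collection idea concrete, here is a small Python sketch that sends a server-side analytics event, using Google Analytics 4's Measurement Protocol as one example platform (the posting names no specific vendor). The measurement ID, API secret, and event fields are placeholders; the /debug/mp/collect endpoint shown is GA4's payload validator, which maps to the QA step described above.

```python
# Hedged server-side event sketch using GA4's Measurement Protocol.
# MEASUREMENT_ID, API_SECRET, and the client_id/event fields are placeholders.
import requests

MEASUREMENT_ID = "G-XXXXXXX"
API_SECRET = "your-api-secret"

def send_event(client_id: str, name: str, params: dict, debug: bool = False) -> dict:
    # Use the debug endpoint to validate payload shape before going live.
    host = "https://www.google-analytics.com"
    path = "/debug/mp/collect" if debug else "/mp/collect"
    resp = requests.post(
        f"{host}{path}",
        params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
        json={
            "client_id": client_id,
            "events": [{"name": name, "params": params}],
        },
        timeout=10,
    )
    resp.raise_for_status()
    # The live endpoint returns an empty body; the debug endpoint returns JSON.
    return resp.json() if debug else {}

if __name__ == "__main__":
    # QA pass: validationMessages should be empty for a well-formed payload.
    print(send_event("555.123", "add_to_cart", {"currency": "INR", "value": 499.0}, debug=True))
```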
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
Navi Mumbai, Maharashtra
On-site
As the Head of Business Intelligence, you will be responsible for developing and implementing a comprehensive business intelligence strategy. Your duties will include overseeing the design and maintenance of data systems and dashboards, collaborating with cross-functional teams to identify data needs and opportunities, and analyzing complex datasets to provide actionable insights and recommendations. Additionally, you will lead and mentor a team of data analysts and BI professionals, ensuring data accuracy, security, and compliance with regulations. You will monitor industry trends and emerging technologies to enhance BI capabilities, present findings and strategies to senior leadership and stakeholders, lead and organize the BI Operations team, and support workstream projects in the implementation and use of new BI software tools and systems.

To excel in this role, you should have proven experience as a Head of Business Intelligence, Operations Director, or in a similar leadership role. A strong background in data analytics, leadership, and strategic planning is essential, with a proven ability to translate complex data into actionable insights. Excellent leadership skills are required, along with experience leading a team of reporting and analytics professionals.

Your knowledge should encompass data analytics and reporting as well as a strong understanding of database administration, data modeling, business intelligence, SQL querying, data warehousing, and online analytical processing (OLAP). Proficiency in IT skills and industry-specific software is necessary, along with commercial awareness and a thorough understanding of the competitive environment. Project management skills are essential, including knowledge of methodologies such as Agile, Lean, and Six Sigma, as are time management skills and the ability to prioritize effectively and delegate when appropriate. Proficiency in Power BI and query experience are also required.

You will be expected to use a range of tools in this role, including BI platforms such as Power BI, SAP, and Tableau, as well as the Microsoft BI stack: Power Pivot, SSIS, SSRS, and SSAS. Familiarity with project management software will also be beneficial for managing BI operations and projects effectively.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
As a Snowflake Data Engineer at our organization, you will play a vital role in designing, developing, and maintaining our data infrastructure. Your responsibilities will include ingesting, transforming, and distributing data using Snowflake and AWS technologies, and collaborating with various stakeholders to ensure efficient data pipelines and secure data operations.

You will design and implement data pipelines using Snowflake and AWS, leveraging tools such as SnowSQL, Snowpipe, NiFi, Matillion, and dbt to ingest, transform, and automate data integration processes. Implementing role-based access controls and managing AWS resources will be crucial for ensuring data security and supporting Snowflake operations. You will also optimize Snowflake queries and data models for performance and scalability.

To excel in this role, you should have strong proficiency in SQL and Python, along with hands-on experience with Snowflake and AWS services. An understanding of ETL/ELT tools, data warehousing concepts, and data quality techniques is essential. Your analytical skills, problem-solving abilities, and excellent communication skills will enable you to collaborate effectively with data analysts, data scientists, and other team members.

Preferred skills include experience with data virtualization, machine learning and AI concepts, and data governance and data security best practices. Staying updated with the latest advancements in Snowflake and AWS technologies will be essential.

If you are a passionate and experienced Snowflake Data Engineer with 5 to 7 years of experience, we invite you to apply and be a part of our team. This is a full-time position based in Gurgaon, with a hybrid work mode accommodating India, UK, and US work shifts.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
You are an experienced ETL Developer with over 5 years of expertise in data transformation, integration, and visualization. Your role will involve designing, implementing, and maintaining ETL processes using SQL Server Integration Services (SSIS) for efficient data extraction, transformation, and loading. You will develop data transformation and integration solutions, ensuring data consistency and accuracy across systems, and perform data cleansing, validation, and enrichment to deliver high-quality data for reporting and analysis.

Analyzing large datasets to uncover trends, patterns, and insights, and performing data mining to support decision-making, will be crucial. You will create and maintain visualizations to present data insights effectively to stakeholders, and develop and maintain database objects including tables, stored procedures, triggers, and user functions within a SQL Server environment. A minimal SQL Server loading sketch follows below.

Your expertise in implementing end-to-end BI solutions using the MSBI stack, including SSRS (SQL Server Reporting Services) and SSAS (SQL Server Analysis Services), will be put to use. Additionally, you will design and manage data warehouse solutions, optimize T-SQL query performance, and create and manage reusable SSIS components for streamlined ETL processes. Contributing to physical and logical database design, data mapping, and table normalization will also be part of your responsibilities, as will identifying dimensions, facts, measures, and hierarchies for data migration to SQL Server databases. You will use DMVs, SQL Profiler, and Extended Events for database performance optimization, debugging, and tuning, work with Microsoft Azure cloud services including Azure Blob Storage, Azure SQL Server, and Azure Data Factory, use Git (Azure DevOps) for version control and collaboration, and participate in Agile/Scrum methodologies for project management and development.

Minimum qualifications include a minimum of 5 years of experience in ETL development and data warehousing; advanced proficiency in T-SQL, SSIS, and SQL Server database management; experience with the MSBI stack, including SSRS and SSAS; knowledge of data warehousing methodologies and concepts; skill in optimizing T-SQL queries and database performance; experience with Microsoft Azure cloud services; familiarity with Agile/Scrum project management methodologies; and proficiency in Git version control and Azure DevOps.

Preferred qualifications include a Bachelor's degree in Computer Science, Information Systems, or a related field; certifications in Microsoft SQL Server or Azure data technologies; and experience with additional BI tools and data visualization platforms.
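As a small sketch of the loading step such an ETL process performs, here is Python code that bulk-inserts rows into a SQL Server staging table with pyodbc, one common way to script around SSIS packages. The connection string, table, and columns are placeholders; in practice the heavy lifting would live in SSIS or T-SQL.

```python
# Hedged bulk-load sketch into SQL Server via pyodbc; identifiers are placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.example.com;DATABASE=StagingDB;"
    "UID=etl_user;PWD=secret"  # placeholder credentials
)

rows = [
    (1, "2024-01-01", 125.50),
    (2, "2024-01-01", 80.00),
]

conn = pyodbc.connect(CONN_STR)
try:
    cur = conn.cursor()
    cur.fast_executemany = True  # batches the inserts for speed
    cur.execute("TRUNCATE TABLE stg.Orders")  # idempotent staging load
    cur.executemany(
        "INSERT INTO stg.Orders (OrderId, OrderDate, Amount) VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()
    print(f"loaded {len(rows)} rows into stg.Orders")
finally:
    conn.close()
```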
Posted 2 days ago
3.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
The Applications Development Senior Programmer Analyst role is an intermediate-level position in which you will participate in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective is to contribute to applications systems analysis and programming activities.

Your responsibilities will include conducting feasibility studies, time and cost estimates, IT planning, risk technology, applications development, and model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will monitor and control all phases of the development process, including analysis, design, construction, testing, and implementation, and provide user and operational support on applications to business users. Utilizing your in-depth specialty knowledge of applications development, you will analyze complex problems and issues, evaluate business processes, system processes, and industry standards, and make evaluative judgments. You will recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality. Additionally, you will consult with users, clients, and other technology groups, recommend advanced programming solutions, and install and assist customer exposure systems. You will ensure that essential procedures are followed, help define operating standards and processes, and serve as an advisor or coach to new or lower-level analysts. It is essential to assess risk appropriately when making business decisions, with particular consideration for the firm's reputation and for safeguarding Citigroup, its clients, and its assets.

To qualify, you should have 9+ years of relevant experience in an applications development or systems analysis role, extensive experience in the system analysis and programming of data warehouse projects, and experience managing and implementing successful projects. Expertise in creating T-SQL queries, stored procedures, functions, and triggers using SQL Server 2019 or later is required, along with proficiency in data warehousing and relational model concepts, designing and developing SSIS packages, developing dashboards and reports using QlikView or SSRS, working on BAU JIRAs, providing detailed analysis and documentation of processes, working with DevOps tools, and experience with Big Data development technologies. You should also have experience in the systems analysis and programming of software applications, the ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements, 3+ years of experience leading small to medium-sized development teams, the ability to provide technical leadership and mentorship to junior developers, and consistently clear and concise written and verbal communication skills. A Bachelor's degree or equivalent experience is required.

This job description provides a high-level overview of the work performed in the role; other job-related duties may be assigned as required. Citi is an equal opportunity and affirmative action employer, providing career opportunities for all qualified interested applicants.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Bhubaneswar
On-site
As a PySpark Developer (VIS), your primary responsibility will be to develop high-performance PySpark applications for large-scale data processing. You will collaborate with data engineers and analysts to integrate data pipelines and design ETL processes using PySpark, and optimize existing data models and workflows to enhance overall performance. A minimal PySpark sketch follows below.

You will analyze large datasets to derive actionable insights, ensure data quality and integrity throughout the data processing lifecycle, and use SQL for querying databases and validating data, along with cloud technologies to deploy and maintain data solutions. You will participate in code reviews, maintain version control, and clearly document all processes, workflows, and system changes. Supporting the resolution of production issues, assisting stakeholders, and mentoring junior developers on best practices in data processing are also part of your responsibilities, as are staying updated on emerging technologies and industry trends, implementing data security measures, and contributing insights for project improvements in team meetings.

Qualifications required for this position include a Bachelor's degree in Computer Science, Engineering, or a related field, along with 3+ years of experience in PySpark development and data engineering. Strong proficiency in SQL and relational databases, experience with ETL tools and data processing frameworks, familiarity with Python for data manipulation and analysis, and knowledge of big data technologies such as Apache Hadoop and Spark are necessary. Experience working with cloud platforms such as AWS or Azure, an understanding of data warehousing concepts and strategies, excellent problem-solving and analytical skills, attention to detail, a commitment to quality, the ability to work independently and as part of a team, excellent communication and interpersonal skills, experience with version control systems such as Git, the ability to manage multiple priorities in a fast-paced environment, a willingness to learn and adapt to new technologies, strong organizational skills, and reliability in meeting deadlines are also essential.

In summary, the ideal candidate combines skills across cloud technologies, big data, version control, data warehousing, PySpark, ETL, Python, Azure, Apache Hadoop, data analysis, Apache Spark, SQL, and AWS.
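To illustrate the kind of PySpark ETL step described above, here is a minimal, self-contained sketch that reads a CSV, applies a transformation, and writes Parquet. The file paths, columns, and filter rule are illustrative assumptions.

```python
# Minimal PySpark ETL sketch; paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV with header and inferred types.
orders = spark.read.csv("raw/orders.csv", header=True, inferSchema=True)

# Transform: drop bad rows, derive a column, aggregate per region.
summary = (
    orders
    .filter(F.col("amount") > 0)
    .withColumn("order_date", F.to_date("order_date"))
    .groupBy("region")
    .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue"))
)

# Load: write the curated summary as Parquet for downstream consumers.
summary.write.mode("overwrite").parquet("curated/region_revenue")

spark.stop()
```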
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
As a Backend Developer / Data Engineer at Siemens Energy, you will lead a team of data engineers to translate business requirements into analytical products and drive the company's transformation into a data-driven organization. As part of the Data Analytics & AI team, you will play a pivotal role in the energy transition. Your responsibilities will include driving technical implementation and strategy execution, defining the Minimum Viable Product (MVP) scope, collaborating with data owners, ensuring the performance and quality of integration pipelines, and promoting data literacy through self-service visualization capabilities for end users.

To excel in this role, you should have extensive experience in data architecture, data acquisition, data modeling, and analytics, along with strong leadership skills and the ability to lead a data operations team effectively. You should also be well-versed in database and data warehouse modeling, in developing data pipelines for cloud and hybrid infrastructures, and in using modern software development tools.

The team you will join, within the enterprise team for Advanced Analytics & AI, focuses on developing methodologies and tools to enhance data-driven decision-making in energy solutions worldwide. As part of Siemens Energy, you will contribute to meeting global energy demands while prioritizing climate protection and sustainability.

Siemens Energy values diversity and inclusion, recognizing the creative energy generated by more than 130 nationalities within the organization. Employees are encouraged to make a difference and contribute to the company's focus on decarbonization, new technologies, and energy transformation. In terms of benefits, Siemens Energy offers remote working arrangements, medical insurance coverage for employees and their families, and meal card options, among other perks.

Join Siemens Energy in shaping the future of sustainable and affordable energy solutions and be part of a global team dedicated to innovation and positive impact. For more information on how you can contribute to Siemens Energy's mission, visit: https://www.siemens-energy.com/employeevideo
Posted 2 days ago
0.0 - 4.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
BONbLOC is a five-year-old, fast-growing software and services company, certified as a Great Place to Work. With a team of over 200 professionals in India and the US, we specialize in developing SaaS solutions for supply chain data collection and analysis using Blockchain, Data Science, and IoT technologies. Our services group offers offshore and onsite support to large customers for their IT modernization projects involving technologies such as Mainframe, AS400, Cognos, Oracle, .NET, Angular, Java, Tableau, Xamarin, Android, and more.

We focus on building and implementing SaaS products on blockchain, IoT, and AI for supply chain monitoring and tracking, as well as providing modernization solutions across a range of technologies. Our mission is to create simple, scalable solutions using Blockchain, IoT, and AI that deliver unprecedented business value to our customers year after year. Our vision is to evolve into an advanced information technology company driven by happy, intellectual, and highly capable individuals. At BONbLOC, we uphold the core values of Integrity, Collaboration, Innovation, and Excellence.

Our Business Development training program aims to groom candidates for a challenging and fulfilling career in business development at BONbLOC. The program is designed as a multi-track graduate-level curriculum, delivered through a proctored self-learning boot camp. It is a significant investment by the company, with expectations of high performance and strict adherence to procedures.

Requirements for the program include a degree from a recognized institution, high academic performance, and proficiency in spoken and written English. We seek self-motivated, mature, and responsible learners who are willing to put in the hard work required to achieve the program's learning objectives. Selected candidates will be stationed at our Chennai office.

Key responsibilities will include identifying and researching potential clients and business opportunities, utilizing various channels for lead generation, analyzing market trends, gathering competitor data, reaching out to clients, and maintaining strong client relationships. Additional responsibilities involve creating sales presentations, proposals, and contracts, as well as preparing reports on sales activities and outcomes.

In return, we offer a competitive salary and benefits package, opportunities for professional growth and development, a supportive work environment, and exposure to various business aspects with hands-on experience on real-world projects.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a member of the Providence Cybersecurity (CYBR) team, you will play a crucial role in safeguarding all information pertaining to caregivers, affiliates, and confidential business data. Your responsibilities will include collaborating with Product Management to assess use cases, functional requirements, and technical specifications, and conducting data discovery and analysis to identify the data in source systems that is crucial to meeting business needs. You will develop conceptual and logical data models to validate requirements, highlighting essential entities and relationships and documenting assumptions and risks. Your role will also involve translating logical data models into physical data models, creating source-to-target mapping documentation, and defining transformation rules. Supporting engineering teams in implementing physical data models, applying transformation rules, and ensuring compliance with data governance, security frameworks, and encryption mechanisms in cloud environments will be a key part of your responsibilities. Furthermore, you will lead a team of data engineers in designing, developing, and implementing cloud-based data solutions using Azure Databricks and Azure-native services.

The ideal candidate will possess a Bachelor's degree in a related field such as computer science, along with certifications in data engineering or cyber security, or equivalent experience. Experience working with large and complex data environments, expertise in data integration patterns and tools, and a solid understanding of cloud computing concepts and distributed computing principles are essential. Proficiency in Databricks, Azure Data Factory (ETL pipelines), and the Medallion Architecture, along with hands-on experience designing and implementing data solutions using Azure cloud services, is required. Strong skills in SQL, Python, Spark, data modeling techniques, dimensional modeling, and data warehousing concepts are crucial for this role. Relevant certifications such as Microsoft Certified: Azure Solutions Architect Expert or Microsoft Certified: Azure Data Engineer Associate are highly desirable. Excellent problem-solving, analytical, leadership, and communication skills are essential for communicating technical concepts and strategies to stakeholders at all levels, and you should be able to lead cross-functional teams, drive consensus, and achieve project goals in a dynamic and fast-paced environment.
Posted 2 days ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Database Engineer at our company, you will be responsible for designing, developing, and implementing complex database solutions. This includes creating data models, tables, indexes, and constraints to ensure efficient database operations. Your expertise in PL/SQL will be crucial as you develop stored procedures, functions, triggers, and packages to automate tasks and enforce business rules effectively (a minimal sketch of calling such a procedure from Python follows below).

Database performance tuning will be a key aspect of your role: you will identify bottlenecks and apply optimization strategies to enhance query performance and system efficiency, and monitor database health to prevent downtime and address potential issues promptly. You will design and implement data integration and ETL processes that extract, transform, and load data from various sources into the target database, ensuring data consistency and quality throughout so that information remains accurate and reliable. Implementing robust security measures to safeguard sensitive data, including user access controls, encryption, and data masking, will be a priority, as will staying updated on security best practices and industry standards.

Collaboration and communication will be key: you will work with development teams, data analysts, and business stakeholders to understand their needs and provide effective solutions, communicating technical concepts clearly and concisely to both technical and non-technical audiences.

Required Skills and Experience:
- Strong PL/SQL development and Oracle database administration skills
- Proficiency in SQL, SQL tuning, and performance optimization methods
- Experience with data modeling and database design
- Knowledge of data integration and ETL processes
- Understanding of database security best practices
- Strong troubleshooting and problem-solving abilities
- Ability to work independently and collaboratively

Preferred Skills:
- Experience with data warehousing and business intelligence concepts
- Familiarity with cloud-based database systems such as AWS RDS and Oracle Cloud Infrastructure
- Knowledge of programming languages for automation, such as Python and Perl
- Certification in Oracle Database Administration or PL/SQL

If you are a highly skilled database engineer with a passion for data and a strong understanding of Oracle technologies, we welcome you to apply for this exciting opportunity. Please note that this is a full-time position based in Bangalore, Chennai, Delhi, Hyderabad, Kolkata, Navi Mumbai, Pune, or Vadodara. The work mode is onsite, and the ideal candidate should have 6 to 8 years of experience in the field.
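Since the posting mentions Python for automation, here is a hedged sketch of invoking a PL/SQL stored procedure from Python using the python-oracledb driver. The DSN, credentials, and the procedure name with its two parameters are hypothetical placeholders for illustration.

```python
# Hedged sketch: calling a (hypothetical) PL/SQL procedure with python-oracledb.
import oracledb

conn = oracledb.connect(
    user="app_user",          # placeholder credentials
    password="secret",
    dsn="dbhost.example.com/orclpdb1",
)

try:
    cur = conn.cursor()
    # Call a hypothetical procedure that archives orders older than N days.
    cur.callproc("archive_old_orders", ["SALES", 90])
    conn.commit()

    # Quick verification query after the procedure runs.
    cur.execute("SELECT COUNT(*) FROM orders_archive")
    (archived,) = cur.fetchone()
    print(f"rows in archive: {archived}")
finally:
    conn.close()
```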
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within Consumer and Community Banking Data Technology, you serve as a seasoned member of an agile team, designing and delivering trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

You will execute software solutions, design, development, and technical troubleshooting, thinking beyond routine or conventional approaches to build solutions and break down technical problems. Your role involves creating secure, high-quality production code and maintaining algorithms that run synchronously with the appropriate systems. You will produce architecture and design artifacts for complex applications and be accountable for ensuring design constraints are met during software development. Additionally, you will gather, analyze, synthesize, and visualize large, diverse data sets in service of the continuous improvement of software applications and systems, proactively identifying hidden problems and patterns in data and using those insights to improve coding hygiene and system architecture. You will also contribute to software engineering communities of practice and events exploring new and emerging technologies, adding to a team culture of diversity, equity, inclusion, and respect.

Required qualifications, capabilities, and skills include formal training or certification in software engineering concepts and 3+ years of applied experience. You should have full Software Development Life Cycle experience within an Agile framework and expert-level implementation skills with Java, AWS, database technologies, Python, Scala, Spark, and Ab Initio. Experience with the development and decomposition of complex SQL (RDBMS platforms) and data warehousing concepts, such as Star Schema, is essential. Practical experience delivering projects in Data and Analytics, Big Data, Data Warehousing, and Business Intelligence, along with familiarity with the relevant technological solutions and industry best practices, is also required, as is a good understanding of data engineering challenges and proven experience with data platform engineering (batch and streaming, ingestion, storage, processing, management, integration, consumption). Familiarity with multiple Data & Analytics technology stacks and awareness of various Data & Analytics tools and techniques (e.g., Python, data mining, predictive analytics, machine learning, data modeling) is important, as is experience with one or more leading cloud providers (AWS/Azure/GCP).

Preferred qualifications, capabilities, and skills include the ability to ramp up quickly on new technologies and strategies, a collaborative working style that builds meaningful relationships in pursuit of common goals, an appreciation of controls and compliance processes for applications and data, an in-depth understanding of data technologies and solutions, a drive to improve processes and implement change as necessary, and knowledge of industry-wide Big Data technology trends and best practices.
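As a hedged illustration of the Star Schema experience the posting asks for, here is a small Spark SQL query in Python joining a fact table to two dimensions. The table and column names (fact_transactions, dim_date, dim_customer, and their keys) are invented for the example, not taken from the posting.

```python
# Illustrative star-schema rollup in Spark SQL; the fact and dimension
# tables below are hypothetical, chosen only to show the join pattern.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-demo").getOrCreate()

# A typical star-schema query: the fact table joins to its dimensions on
# surrogate keys, then a measure is aggregated by dimension attributes.
monthly_spend = spark.sql("""
    SELECT d.year_month,
           c.segment,
           SUM(f.amount) AS total_amount
    FROM   fact_transactions f
    JOIN   dim_date     d ON f.date_key     = d.date_key
    JOIN   dim_customer c ON f.customer_key = c.customer_key
    GROUP BY d.year_month, c.segment
""")

monthly_spend.show()
```

The decomposition skill the posting mentions is the same pattern in reverse: reading a complex production query and recovering the fact/dimension structure behind it.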
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
We are seeking a skilled and experienced Spark Scala Developer with strong expertise in AWS cloud services and SQL to join our data engineering team. Your primary responsibility will be to design, build, and optimize scalable data processing systems that support our data platform. Key responsibilities include:
- Developing and maintaining large-scale distributed data processing pipelines using Apache Spark with Scala
- Working with AWS services (S3, EMR, Lambda, Glue, Redshift, etc.) to build and manage data solutions in the cloud
- Writing complex SQL queries for data extraction, transformation, and analysis
- Optimizing Spark jobs for performance and cost-efficiency
- Collaborating with data scientists, analysts, and other developers to understand data requirements
- Building and maintaining data lake and data warehouse solutions
- Implementing best practices in coding, testing, and deployment
- Ensuring data quality and consistency across systems

To be successful in this role, you should have:
- Strong hands-on experience with Apache Spark (preferably using Scala)
- Proficiency in the Scala programming language
- Solid experience with SQL, including complex joins, window functions, and performance tuning
- Working knowledge of AWS services such as S3, EMR, Glue, Lambda, Athena, and Redshift
- Experience building and maintaining ETL/ELT pipelines
- Familiarity with data modeling and data warehousing concepts
- Experience with version control (e.g., Git) and CI/CD pipelines (a plus)
- Strong problem-solving and communication skills
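Since the posting highlights window functions and Spark job optimization, here is a brief sketch of that pattern. The role calls for Scala, but to keep the examples in this write-up in one language the snippet uses the equivalent Python DataFrame API, which mirrors the Scala API almost line for line; the S3 paths, dataset, and columns are hypothetical.

```python
# Window-function pattern in Spark (Python DataFrame API; the Scala API is
# nearly identical). The S3 paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window-demo").getOrCreate()

# Assumed input: order records landed in S3.
orders = spark.read.parquet("s3://example-bucket/orders/")

# Rank each customer's orders by amount and keep only the largest --
# the kind of windowed transformation the role's SQL work involves.
w = Window.partitionBy("customer_id").orderBy(F.col("amount").desc())

top_orders = (
    orders
    .withColumn("rank", F.row_number().over(w))
    .filter(F.col("rank") == 1)
)

top_orders.write.mode("overwrite").parquet("s3://example-bucket/top_orders/")
```

For cost-efficiency, a job like this would typically also tune partitioning so the window does not shuffle more data than necessary.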
Posted 2 days ago