0.0 - 4.0 years
0 Lacs
Haryana
On-site
As a Data Engineer Intern at Uptitude, you will have the exciting opportunity to work on clean and scalable pipelines, applying your passion for data engineering to solve real-world problems. You will thrive in our fast-paced and dynamic start-up environment, based in our vibrant office in Gurugram, India. Uptitude is a forward-thinking consultancy that specializes in providing outstanding data, AI, and business intelligence solutions to clients worldwide. With our headquarters in London, UK, and teams spanning India and Europe, we are dedicated to empowering businesses with data-driven insights that drive action and growth. Innovation, excellence, and collaboration are the cornerstones of our work at Uptitude.

Your role as a Data Engineer will involve designing, developing, and optimizing data pipelines and infrastructure that support analytics, machine learning, and operational reporting. You will collaborate closely with analysts, BI engineers, data scientists, and business stakeholders to enhance our clients' data capabilities.

To excel in this role, you should hold a degree in Computer Science, Engineering, or a related field. Proficiency in SQL and a programming language (preferably Python), an interest in data modeling, ETL processes, and cloud platforms (such as AWS, GCP, or Azure), as well as a basic understanding of databases, data types, and data wrangling are essential. Strong attention to detail and a willingness to learn in a fast-paced environment will be key to your success.

At Uptitude, we uphold a set of core values that shape our work culture:
- Be Awesome: Strive for excellence, continuously improve your skills, and deliver exceptional results.
- Step Up: Take ownership of challenges, be proactive, and seek opportunities to contribute beyond your role.
- Make a Difference: Embrace innovation, think creatively, and contribute to the success of our clients and the company.
- Have Fun: Foster a positive work environment, celebrate achievements, and build strong relationships.

We value our employees and offer a competitive benefits package, including a salary commensurate with experience, private health insurance coverage, offsite team-building trips, quarterly outings for unwinding and celebrating achievements, and corporate English lessons with a UK instructor. Join our fast-growing company with a global client base and seize the opportunity to grow and develop your skills in a dynamic and exciting environment. Apply now to be part of a team that is transforming data into opportunity at Uptitude.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
The Director, Business Insights is a key role within the Business Insights Team, which is dedicated to enabling all services to make data-driven decisions and operate efficiently. You will play a crucial role in partnering with leadership across various functions such as Sales, Delivery, Product, Finance, and more to enhance strategic decision-making through data and facts. Your responsibilities will include diagnosing strategic gaps and opportunities within operations and implementing corrective measures. Additionally, you will be involved in building data-driven infrastructure, driving productivity enhancements, and identifying technology solutions to meet business needs.

In this role, you will influence decision-making processes within a dedicated function by providing data-driven insights to functional leaders. You will establish measurement frameworks, KPIs, and analysis questions to evaluate the health of the business. Your focus will be on optimizing team members' time on core activities by automating processes, simplifying workflows, and fostering collaboration. You will also lead special projects that require cross-functional collaboration, such as M&A integration and Agile initiatives. Furthermore, you will oversee the development of the Services analytic infrastructure to ensure optimal system configuration and centralized data aggregation.

To excel in this position, you should possess excellent problem-solving skills, with an emphasis on scalable and automated frameworks and processes. A deep understanding of the business landscape relevant to the Services function is essential, along with proficiency in managing various data sets and utilizing data analytic tools. Your technical acumen, coupled with a generalist mindset and strong communication skills, will be instrumental in driving success in this role. Prior consulting experience would be advantageous.

As a representative of Mastercard, you are expected to prioritize corporate security responsibilities. This includes adhering to security policies, safeguarding the confidentiality and integrity of information accessed, reporting any security violations or breaches, and participating in mandatory security training sessions. Join us in this dynamic role as Director, Business Insights and contribute to the growth and success of our data-driven decision-making processes.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
The BI Developer role involves designing, developing, and maintaining business intelligence solutions to facilitate data-driven decision-making within the organization. You will collaborate with stakeholders to comprehend their requirements and convert them into technical specifications. Your responsibilities will include developing data models, designing ETL processes, and creating reports and dashboards.

You will be accountable for:
- Designing, developing, and maintaining business intelligence solutions encompassing data models, ETL processes, reports, and dashboards.
- Collaborating closely with stakeholders to gather business requirements and translate them into technical specifications.
- Creating and managing ETL processes for extracting, transforming, and loading data from diverse sources into the data warehouse.
- Building and managing data models to enable efficient querying and analysis of extensive datasets.
- Developing and maintaining reports and dashboards utilizing BI tools like Tableau and Power BI.
- Monitoring and troubleshooting BI solutions to ensure optimal performance and data accuracy.
- Coordinating with data engineers and database administrators to guarantee proper data storage and optimization for reporting and analysis.
- Documenting BI solutions, including data dictionaries, technical specifications, and user manuals.
- Keeping abreast of industry trends in business intelligence and data analytics.
- Collaborating with other developers and stakeholders to align BI solutions with the organization's goals.

Requirements:
- Bachelor's degree in computer science, information systems, or a related field.
- Minimum 4 years of experience in developing business intelligence solutions.
- Proficiency in SQL, ETL processes, and data modeling.
- Experience with data visualization best practices and techniques.
- Strong analytical and problem-solving abilities.
- Excellent communication and collaboration skills.
- Experience in developing SSRS and Power BI reports and designing Tabular/Multidimensional cubes.

Application Question(s):
- How many years of hands-on experience do you have with Azure services such as ADF and Synapse?
- How many years of experience do you have working with Power BI, SQL, and ETL tools?

This is a full-time, permanent position with benefits including health insurance and provident fund. The work schedule is a fixed day shift from Monday to Friday, to be carried out in person.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
You are a highly skilled and experienced Senior Engineer in Data Science who will be responsible for designing and implementing next-generation data science solutions. Your role will involve shaping the data strategy and driving innovation through advanced analytics and machine learning.

In this position, your responsibilities will include providing technical leadership and designing end-to-end data science solutions. This encompasses data acquisition, ingestion, processing, storage, modeling, and deployment. You will also be tasked with developing and maintaining scalable data pipelines and architectures using cloud-based platforms and big data technologies to handle large volumes of data efficiently. Collaboration with stakeholders to define business requirements and translate them into technical specifications is essential.

As a Senior Engineer in Data Science, you will select and implement appropriate machine learning algorithms and techniques, staying updated on the latest advancements in AI/ML to solve complex business problems. Building and deploying machine learning models, monitoring and evaluating model performance, and providing technical leadership and mentorship to junior data scientists are also key aspects of this role. Furthermore, you will contribute to the development of data science best practices and standards.

To qualify for this position, you should hold a B.Tech/M.Tech/M.Sc (Mathematics/Statistics)/PhD from India or abroad. You are expected to have at least 4 years of experience in data science and machine learning, with around 7+ years of overall experience. A proven track record of technical leadership and implementing complex data science solutions is required, along with a strong understanding of data warehousing, data modeling, and ETL processes. Expertise in machine learning algorithms and techniques, time series analysis, programming proficiency in Python, knowledge of general data science tools, domain knowledge in Industrial, Manufacturing, and/or Healthcare, proficiency in cloud-based platforms and big data technologies, and excellent communication and collaboration skills are all essential qualifications for this role. Additionally, contributions to open-source projects or publications in relevant fields will be considered an added advantage.
Posted 1 month ago
15.0 - 19.0 years
0 Lacs
Karnataka
On-site
Seeking an experienced Senior Business Intelligence Expert with deep expertise in PowerBI development and a proven track record of creating high-performance, visually compelling business intelligence solutions. The ideal candidate will have extensive experience in semantic modeling, data pipeline development, and API integration, with the ability to transform complex data into actionable insights through intuitive dashboards that follow consistent branding guidelines and utilize advanced visualizations.

As a Senior Business Intelligence Expert, you will be responsible for designing, developing, and maintaining enterprise-level PowerBI solutions that drive business decisions across the organization. Your expertise in data modeling, ETL processes, and visualization best practices will be essential in delivering high-quality BI assets that meet performance standards and provide exceptional user experiences.

Responsibilities:
- Lead the optimization and performance tuning of PowerBI reports, dashboards, and datasets to ensure fast loading times and efficient data processing.
- Enhance the BI user experience by implementing consistent branding, modern visual designs, and intuitive navigation across all PowerBI assets.
- Develop and maintain complex data models using PowerBI's semantic modeling capabilities to ensure data accuracy, consistency, and usability.
- Create and maintain data ingestion pipelines using Databricks, Python, and SQL to transform raw data into structured formats suitable for analysis.
- Design and implement automated processes for integrating data from various API sources.
- Collaborate with stakeholders to understand business requirements and translate them into effective BI solutions.
- Provide technical leadership and mentoring to junior BI developers.
- Document technical specifications, data dictionaries, and user guides for all BI solutions.

Requirements:
- Minimum 15+ years of experience in business intelligence, data analytics, or related field.
- Expert-level proficiency with PowerBI Desktop, PowerBI Service, and PowerBI Report Server.
- Advanced knowledge of DAX, M language, and PowerQuery for sophisticated data modeling.
- Strong expertise in semantic modeling principles and best practices.
- Extensive experience with custom visualizations and complex dashboard design.
- Proficient in SQL for data manipulation and optimization.
- Experience with Python for data processing and ETL workflows.
- Proven track record of API integration and data ingestion from diverse sources.
- Strong understanding of data warehouse concepts and dimensional modeling.
- Bachelor's degree in Computer Science, Information Systems, or related field (or equivalent experience).

The ideal candidate will also possess knowledge and experience with emerging technologies and advanced PowerBI capabilities that can further enhance our BI ecosystem.

Nice to Have Skills:
- Experience implementing AI-powered analytics tools and integrating them with PowerBI.
- Proficiency with Microsoft Copilot Studio for creating AI-powered business applications.
- Expertise across the Microsoft Power Platform (Power Apps, Power Automate, Power Virtual Agents).
- Experience with third-party visualization tools such as Inforiver for enhanced reporting capabilities.
- Knowledge of writeback architecture and implementation in PowerBI solutions.
- Experience with PowerBI APIs for custom application integration and automation.
- Familiarity with DevOps practices for BI development and deployment.
- Certifications such as Microsoft Certified: Data Analyst Associate, Power BI Developer, or Azure Data Engineer.

This role offers an opportunity to work with cutting-edge business intelligence technologies while delivering impactful solutions that drive organizational success through data-driven insights. Come as You Are. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Database Developer and Designer, you will be responsible for building and maintaining Customer Data Platform (CDP) databases to ensure performance and stability. Your role will involve optimizing SQL queries to improve performance, creating visual data models, and administering database security. Troubleshooting and debugging SQL code issues will be a crucial part of your responsibilities.

You will be involved in data integration tasks, importing and exporting events, user profiles, and audience changes to Google BigQuery. Utilizing BigQuery for querying, reporting, and data visualization will be essential. Managing user and service account authorizations, as well as integrating Lytics with BigQuery and other data platforms, will also be part of your duties. Handling data export and import between Lytics and BigQuery, configuring authorizations for data access, and utilizing data from various source systems to integrate with CDP data models are key aspects of the role.

Preferred candidates will have experience with Lytics CDP and CDP certification. Hands-on experience with at least one Customer Data Platform technology and a solid understanding of the digital marketing ecosystem are required.

Your skills should include proficiency in SQL and database management, strong analytical and problem-solving abilities, experience with data modeling and database design, and the capability to optimize and troubleshoot SQL queries. Expertise in Google BigQuery and data warehousing, knowledge of data integration and ETL processes, familiarity with Google Cloud Platform services, and a strong grasp of data security and access management are essential. You should also be proficient in Lytics and its integration capabilities, have experience with data import/export processes, knowledge of authorization methods and security practices, strong communication and project management skills, and the ability to learn new CDP technologies and deliver in a fast-paced environment.

Ultimately, your role is crucial for efficient data management and enabling informed decision-making through optimized database design and integration.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
Navi is a rapidly growing financial services company in India that offers a range of products including Personal & Home Loans, UPI, Insurance, Mutual Funds, and Gold. The mission of Navi is to provide digital-first financial solutions that are easy to use, accessible, and affordable. Leveraging our in-house AI/ML capabilities, technology, and product expertise, Navi is dedicated to creating exceptional customer experiences.

As a Navi_ite, you should embody the following qualities:
- Perseverance, Passion, and Commitment: Demonstrate dedication, passion for Navi's mission, and ownership by going above and beyond in your responsibilities.
- Obsession with high-quality results: Consistently deliver value to customers and stakeholders by producing high-quality outcomes, ensuring excellence in all work aspects, and achieving high standards through efficient time management.
- Resilience and Adaptability: Quickly adapt to new roles, responsibilities, and changing circumstances while demonstrating resilience and agility.

The role involves:
- Monitoring delinquent accounts to meet company standards
- Evaluating underwriting model stability and recommending policy actions to increase approval rates or reduce risk
- Developing and maintaining loss forecasting models and ECL provisioning for accounting purposes
- Implementing a portfolio monitoring and early warning alert system for loan book health
- Utilizing data for decision-making, ensuring data accuracy, defining metrics for process improvement, and identifying areas for product and process enhancement through customer segmentation and analysis
- Working with the engineering and data platform team to ensure data availability and reliability

Desired Candidate Profile:
- Experience in managing multiple stakeholders
- Startup mindset
- Team building skills
- Proficiency in SQL and any BI platform (Tableau, Power BI, QlikView, Looker, QuickSight, etc.)
- Understanding of basic statistical concepts and data platforms
- Ability to identify relevant data for solving business problems and validating hypotheses
- Prior experience in data modeling in R/Python and classification/regression techniques (good to have)
- Prior experience/understanding of the lending/insurance/banking domain (good to have)
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
The role involves designing, developing, and maintaining Qlik Sense and Power BI applications to support the data analysis and visualization needs of the organization. You will be responsible for designing and creating Qlik Sense and Power BI dashboards and reports based on predefined requirements. You will identify trends and insights and provide actionable recommendations to business teams. Regularly updating and maintaining existing dashboards to ensure data accuracy will be part of your responsibilities. Implementing security and access controls for dashboards and reports is also key.

You will translate data into clear and effective visualizations for business users. Using Qlik Sense and Power BI, you will create interactive reports that highlight key metrics and trends. Data gathering and analysis will involve extracting, cleaning, and transforming large datasets from multiple sources, including databases, APIs, and cloud platforms. Ensuring data integrity by performing data quality checks and validation is crucial. Creating DAX queries, calculated columns, and measures in Power BI for advanced analytics, as well as developing Qlik Sense scripts for data transformations and load processes, will be part of your tasks.

Collaboration with business users to understand reporting needs and deliver solutions that meet those requirements is essential. Collaborating with team members to optimize the use of Qlik Sense and Power BI features and functionalities is expected. You will also assist users with basic troubleshooting related to Qlik Sense and Power BI reports and dashboards. Providing user support for navigating and interpreting data visualizations is part of your role.

Location: Pune, India

Essential Skills:
- Strong understanding of data visualization best practices and UI/UX principles.
- Experience working with relational databases (SQL Server, MySQL, PostgreSQL, etc.).
- Knowledge of ETL processes, data warehousing, and data modeling concepts.
- Experience with Power Automate, Power Apps, and Qlik NPrinting (nice to have).
- Experience in Python or R for data analysis.

Education Requirements & Experience:
- 3-6 years of experience in Power BI and Qlik Sense development.
- MS or bachelor's degree in engineering, computer science, or a related field.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, you will be part of a team of innovative professionals working with cutting-edge technologies. Our purpose is anchored in bringing real positive changes in an increasingly virtual world, transcending generational gaps and future disruptions.

We are currently seeking SQL Professionals for the role of Data Engineer with 4-6 years of experience. The ideal candidate must have a strong academic background. As a Data Engineer at BNY Mellon in Pune, you will be responsible for designing, developing, and maintaining scalable data pipelines and ETL processes using Apache Spark and SQL. You will collaborate with data scientists and analysts to understand data requirements, optimize and query large datasets, ensure data quality and integrity, implement data governance and security best practices, participate in code reviews, and troubleshoot data-related issues promptly.

Qualifications for this role include 4-6 years of experience in data engineering, proficiency in SQL and data processing frameworks like Apache Spark, knowledge of database technologies such as SQL Server or Oracle, experience with cloud platforms like AWS, Azure, or Google Cloud, familiarity with data warehousing solutions, understanding of Python, Scala, or Java for data manipulation, excellent analytical and problem-solving skills, and good communication skills to work effectively in a team environment.

Joining YASH means being empowered to shape your career in an inclusive team environment. We offer career-oriented skilling models and promote continuous learning, unlearning, and relearning at a rapid pace. Our workplace is based on four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Jaipur, Rajasthan
On-site
We are seeking a Data Cloud Integration Specialist to oversee the management of data ingestion, unification, and activation in Salesforce Data Cloud and external systems. The ideal candidate will have practical experience in constructing robust integration pipelines, collaborating with APIs, and facilitating a seamless data flow among various platforms.

Your responsibilities will include designing and executing data ingestion workflows from diverse sources into Salesforce Data Cloud, harmonizing data from multiple systems to establish a comprehensive 360 customer view, and establishing and sustaining integrations utilizing APIs, ETL tools, and middleware solutions. You will work closely with data architects, developers, and business teams to acquire integration requirements, monitor and enhance integration performance for accuracy and real-time data availability, and ensure compliance with data privacy and governance policies. Additionally, you will be responsible for activating unified data for utilization across Salesforce Marketing, Sales, and Service Clouds.

Key Requirements:
- Proficient in Salesforce Data Cloud (formerly Salesforce CDP).
- Strong expertise in ETL processes, data mapping, and transformation logic.
- Hands-on experience with REST/SOAP APIs and integration tools like MuleSoft or equivalent.
- Sound understanding of data modeling, schema design, and customer data platforms.
- Familiarity with data privacy regulations such as GDPR and CCPA.

Preferred Qualifications:
- Familiarity with Salesforce Marketing Cloud, Service Cloud, and Customer 360.
- Experience with cloud data platforms like Snowflake, Redshift, or BigQuery.
- Salesforce certifications such as Data Cloud Consultant or Integration Architect are advantageous.

This is a contractual/temporary position with a work schedule of Monday to Friday. The ideal candidate should have at least 6 years of experience as a Data Integration Specialist and 5 years of experience with Salesforce Data Cloud. Work Location: In person.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
We are seeking an experienced Power BI Developer with over 4 years of expertise in creating, managing, and enhancing business intelligence solutions utilizing Microsoft Power BI. The ideal candidate should possess a profound understanding of data analytics, data modeling, and visualization techniques, along with a strong track record of delivering significant business insights.

Your responsibilities will include designing, developing, and maintaining Power BI reports, dashboards, and visualizations to align with business objectives. You will collaborate with stakeholders to gather requirements and convert business needs into technical specifications. Additionally, optimizing DAX queries and data models, integrating Power BI solutions with other applications, and leading the end-to-end Power BI development lifecycle will be crucial tasks. Furthermore, you will be responsible for implementing and managing row-level security and data governance policies, monitoring and enhancing report performance, collaborating with cross-functional teams, and maintaining comprehensive documentation of reporting structures and data sources. Staying updated on the latest Power BI features and industry trends to implement best practices is essential.

Qualifications:
- Over 4 years of experience in developing business intelligence solutions using Power BI.
- Expertise in Power BI Desktop, Power BI Service, Power Query, and DAX.
- Strong understanding of data modeling concepts and experience with data transformation using Power Query.
- Advanced SQL skills for querying and transforming data from various sources.
- Proven experience in building complex data models with relationships, hierarchies, and security models.
- Knowledge of ETL processes and integrating data from diverse sources.
- Familiarity with Azure Data Services or similar platforms is a plus.
- Experience working in Agile environments and familiarity with tools like JIRA or Azure DevOps.
- Excellent communication skills and strong problem-solving abilities.

Please note that this job description is sourced from hirist.tech.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Database Engineer at our company, you will be responsible for designing, developing, and implementing complex database solutions. This includes creating data models, tables, indexes, and constraints to ensure efficient database operations. Your expertise in PL/SQL will be crucial as you develop code for stored procedures, functions, triggers, and packages to automate tasks and enforce business rules effectively.

Database performance tuning will be a key aspect of your role, where you will identify bottlenecks and apply optimization strategies to enhance query performance and system efficiency. Monitoring the health of the database to prevent downtime and addressing potential issues promptly will also be part of your responsibilities.

You will play a vital role in data integration and ETL processes by designing and implementing solutions that extract, transform, and load data from various sources into the target database. Ensuring data consistency and quality throughout the integration process will be essential for maintaining accurate and reliable information. Implementing robust security measures to safeguard sensitive data, including user access controls, encryption, and data masking, will be a priority. Staying updated on security best practices and industry standards will be crucial to maintaining a secure database environment.

Collaboration and communication will be key skills required for this role as you work with development teams, data analysts, and business stakeholders to understand their needs and provide effective solutions. Clear and concise communication of technical concepts to both technical and non-technical audiences will be essential.

Required Skills and Experience:
- Strong PL/SQL development and Oracle database administration skills
- Proficiency in SQL, SQL tuning, and performance optimization methods
- Experience with data modeling and database design
- Knowledge of data integration and ETL processes
- Understanding of database security best practices
- Strong troubleshooting and problem-solving abilities
- Ability to work independently and collaboratively

Preferred Skills:
- Experience with data warehousing and business intelligence concepts
- Familiarity with cloud-based database systems like AWS RDS and Oracle Cloud Infrastructure
- Knowledge of programming languages for automation such as Python and Perl
- Certification in Oracle Database Administration or PL/SQL

If you are a highly skilled database engineer with a passion for data and a strong understanding of Oracle technologies, we welcome you to apply for this exciting opportunity. Please note that this is a full-time position based in Bangalore, Chennai, Delhi, Hyderabad, Kolkata, Navi Mumbai, Pune, or Vadodara. The work mode is onsite, and the ideal candidate should have 6 to 8 years of experience in the field.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a skilled Python Developer with expertise in Postgres and Snowflake, you will be responsible for developing data-driven applications and implementing robust data solutions. Your role will involve designing and optimizing database schemas in Postgres, working with Snowflake for data warehousing and analytics, and collaborating with cross-functional teams to gather requirements and deliver effective solutions. If you are passionate about data engineering and enjoy solving complex problems, we want to hear from you!

Key Responsibilities
- Develop and maintain data applications using Python.
- Design and optimize database schemas in Postgres.
- Work with Snowflake for data warehousing and analytics.
- Collaborate with cross-functional teams to gather requirements and deliver effective solutions.
- Ensure the performance, quality, and responsiveness of applications.

Key Requirements
- 3-5 years of experience in Python development.
- Strong knowledge of Postgres for database management.
- Experience with Snowflake for data warehousing solutions.
- Proficient in writing efficient SQL queries.
- Familiarity with data modeling and ETL processes.

Preferred Qualifications
- Knowledge of data visualization tools (e.g., Tableau, Power BI) is a plus.
- Understanding of cloud platforms (e.g., AWS, Azure) and data engineering best practices.
- Exposure to Agile development methodologies.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
The Senior Data Analytics Consultant role is a high-impact position that requires a combination of deep analytical expertise, advanced technological proficiency, and exceptional collaborative skills. Your primary responsibility will be to define and implement data strategies that enhance business outcomes through innovative analytics and predictive modeling. Proficiency in SQL, Power BI, and Alteryx, and familiarity with data modeling, cloud data solutions, and data quality management practices are essential for success in this role. Your leadership and stakeholder management skills will be crucial in ensuring that data insights are effectively translated into actionable business strategies across the organization.

The ideal candidate will possess refined problem-solving abilities, advanced technical skills, and a proven track record of making strategic impacts in the field of data analytics. With at least 6 years of total experience in data analytics, including a minimum of 5 years of hands-on experience in SQL, Alteryx, Power BI, and relational database concepts, you will be well-equipped to excel in this role. Expertise in Power BI, including Power Query, data modeling, visualization, and DAX for creating complex calculations, measures, and custom columns, is crucial. Experience with Row Level Security (RLS) in Power BI, Alteryx for designing robust data workflows, and proficiency in data modeling concepts will be key aspects of your responsibilities. An understanding of cloud technologies and a programming background will be advantageous for this role.

As a Senior Data Analytics Consultant, you will spearhead data analytics initiatives, utilizing advanced data visualization and statistical techniques to uncover insights and optimize growth opportunities. Your responsibilities will include architecting and ensuring precision in data design, metrics, and analytics distributed to interdisciplinary stakeholders, as well as customizing dashboards and workflows using Power BI and Alteryx. Collaboration with various teams to understand processes, challenges, and customer needs, defining key performance indicators (KPIs), and leading projects will be integral parts of your role.

Qualifications for this position include a Bachelor's degree (BE/BTech) in Computer Science, Engineering, Data Science, or related fields. A Master's degree in Data Analytics, Data Science, or Statistics, or relevant certifications such as CAP, DASCA, Microsoft Certified: Data Analyst Associate, or equivalent, is highly preferred. Your ability to lead projects, influence teams, and mentor junior analysts will be essential in this role.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Kolkata, West Bengal
On-site
You should have an in-depth understanding of data management, including permissions, recovery, security, and monitoring. You must also possess strong experience in implementing data analysis techniques such as exploratory data profiling. Additionally, you should have a solid grasp of design patterns and hands-on experience in developing data pipelines for batch processing.

Your role will require you to design and develop ETL processes that populate star schemas from various source data for data warehouse implementations supporting a product in both cloud and on-premise environments. You should be able to actively participate in the requirements gathering process and design business process dimensional models. Collaboration with data providers to address data gaps and make adjustments to source-system data structures for seamless analysis and integration with other company data will be a key responsibility.

A basic understanding of scripting languages like Python is necessary for this role. Moreover, you should be skilled in both proactive and reactive performance tuning at the instance, database, and query levels to optimize data processing efficiency.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Haryana
On-site
We are looking for a Data Warehouse Engineer with a proven track record in developing Business Intelligence solutions to drive key outcomes. As a Data Warehouse Engineer, you will be responsible for various layers of the data hierarchy, from database design and data collection to a deep understanding of data transformation tools and methodologies. Your role will involve provisioning and managing analytical databases and building infrastructures that incorporate machine learning capabilities into production. You will be based in India and will operate within the Business Analytics unit under the Business Analytics Global Manager.

As a Data Warehouse Engineer at Tek Experts, you will play a crucial role in developing data models, gathering data sources, creating ETL processes, and implementing front-end data model solutions to empower our business users to make informed decisions. You will have full responsibility for our Data Warehouse, including infrastructure, data modeling, and audit logging. Additionally, you will be tasked with building automated validation processes to maintain data integrity, optimizing processes for scalability, and taking ownership of designing, building, and deploying data products. Collaboration with Backend, Data Science, Data Analysis, and Product teams will also be a key aspect of your role.

To qualify for this position, you should hold a Bachelor's degree in Information Systems, Industrial Engineering, or have equivalent experience. Professional fluency in English, both written and spoken, is essential. You should have three or more years of experience in BI solutions development, including Data Warehouse, ETL processes, SQL, and tools like SSIS/Informatica. Experience in building automated validation processes, data modeling, working with NoSQL data platforms, ETL tools and methodologies, as well as BI reporting tools, is required. Strong analytical skills are also a must-have in this role.

At Tek Experts, we are committed to diversity, equity, and inclusion. We provide fair employment opportunities without discrimination based on gender, ethnicity, socio-economic background, disability, marital status, or veteran status. Join us in championing our drive towards building an equitable opportunity environment. We do not request any sensitive personal data from our employees.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
As a Lead Azure Data Engineer at CGI, you will have the opportunity to be part of a dynamic team of builders who are dedicated to helping clients succeed. With our global resources, expertise, and stability, we aim to achieve results for our clients and members. If you are looking for a challenging role that offers professional growth and development, this is the perfect opportunity for you.

In this role, you will be responsible for supporting the development and maintenance of our trading and risk data platform. Your main focus will be on designing and building data foundations and end-to-end solutions to maximize the value from data. You will collaborate with other data professionals to integrate and enrich trade data from various ETRM systems and create scalable solutions to enhance the usage of TRM data across different platforms and teams.

Key Responsibilities:
- Implement and manage a lakehouse using Databricks and the Azure tech stack (ADLS Gen2, ADF, Azure SQL).
- Utilize SQL, Python, Apache Spark, and Delta Lake for data engineering tasks.
- Implement data integration techniques, ETL processes, and data pipeline architectures.
- Develop CI/CD pipelines for code management using Git.
- Create and maintain technical documentation for the platform.
- Ensure the platform is developed with software engineering, data analytics, and data security best practices.
- Optimize data processing and storage systems for high performance, reliability, and security.
- Work in Agile methodology and utilize ADO boards for sprint deliveries.
- Demonstrate excellent communication skills to convey technical and business concepts effectively.
- Collaborate with team members at all levels to share ideas and knowledge effectively.

Required Qualifications:
- Bachelor's degree in computer science or a related field.
- 6 to 10 years of experience in software development/engineering.
- Proficiency in Azure technologies including Databricks, ADLS Gen2, ADF, and Azure SQL.
- Strong hands-on experience with SQL, Python, Apache Spark, and Delta Lake.
- Knowledge of data integration techniques, ETL processes, and data pipeline architectures.
- Experience in building CI/CD pipelines and using Git for code management.
- Familiarity with Agile methodology and ADO boards for sprint deliveries.

At CGI, we believe in ownership, teamwork, respect, and belonging. As a CGI Partner, you will have the opportunity to turn meaningful insights into action, develop innovative solutions, and collaborate with a diverse team to shape your career and contribute to our collective success. Join us on this exciting journey of growth and innovation at one of the largest IT and business consulting services firms in the world.
Posted 1 month ago
1.0 - 5.0 years
0 - 0 Lacs
Hyderabad, Telangana
On-site
As a BE graduate with 1 to 3 years of database experience, and preferably ETL experience, you will be responsible for day-to-day tasks related to database management, ETL processes, data analysis, and ensuring data integrity and security in a full-time on-site role located in Hyderabad. Your role will involve utilizing your database management and data analysis skills, along with experience with ETL processes and knowledge of database systems and query languages, to effectively carry out your responsibilities.

Strong problem-solving and analytical skills will be essential for successfully addressing database-related challenges. Furthermore, your good communication and teamwork abilities will play a crucial role in collaborating with team members and stakeholders to achieve project objectives. A Bachelor's degree in Computer Science, Information Technology, or a related field is required for this position. Any additional experience in the banking sector would be advantageous.

The salary offered for this position ranges from 40,000 to 60,000 INR, along with a travel allowance for 15 days of travel.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Orion Innovation is a premier, award-winning, global business and technology services firm. Orion delivers game-changing business transformation and product development rooted in digital strategy, experience design, and engineering, with a unique combination of agility, scale, and maturity. We work with a wide range of clients across many industries including financial services, professional services, telecommunications and media, consumer products, automotive, industrial automation, professional sports and entertainment, life sciences, ecommerce, and education.

As an ETL Technical Lead at Orion Innovation located in Chennai, you are required to have at least 5 years of ETL experience and 3 years of experience specifically in Azure Synapse. Your role will involve designing, developing, and managing ETL processes within the Azure ecosystem. You must possess proficiency with Azure Synapse Pipelines, Azure Dedicated SQL Pool, Azure Data Lake Storage (ADLS), and other related Azure services. Additionally, experience with audit logging, data governance, and implementing data integrity and data lineage best practices is essential.

Your responsibilities will include leading and managing the ETL team, providing mentorship and technical guidance, and driving the delivery of key data initiatives. You will design, develop, and maintain ETL pipelines using Azure Synapse Pipelines for ingesting data from various file formats and securely storing them in Azure Data Lake Storage (ADLS). Furthermore, you will architect, implement, and manage data solutions following the Medallion architecture for effective data processing and transformation. It is crucial to leverage Azure Data Lake Storage (ADLS) to build scalable and high-performance data storage solutions, ensuring optimal data lake management. You will also be responsible for managing the Azure Dedicated SQL Pool to optimize query performance and scalability. Automation of data workflows and processes using Logic Apps, as well as ensuring secure and compliant data handling through audit logging and access controls, will be part of your duties. Collaborating with data scientists to integrate ETL pipelines with machine learning models for predictive analytics and advanced data science use cases is key. Troubleshooting and resolving complex data pipeline issues, monitoring and optimizing performance, and acting as the primary technical point of contact for the ETL team are also essential aspects of this role.

Orion Systems Integrators, LLC and its affiliates are committed to protecting your privacy. For more information on the Candidate Privacy Policy, please refer to the official documentation on the Orion website.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
The ideal candidate for the role of VP Data Science should have a strong background in data analytics, business intelligence, and management. As a visionary leader, you will drive the Data and Analytics, BI function to facilitate data-driven decision-making across the organization. Your responsibilities will include overseeing the development and implementation of machine learning predictive models, BI tools, and systems. It is crucial to ensure data accuracy and integrity while providing actionable insights to various departments. Effective communication, a strategic mindset, and the ability to collaborate with cross-functional teams are essential for success in this role, as you will play a critical part in helping the organization leverage data to achieve its business goals and objectives.

Your key responsibilities will involve developing and implementing the overall Data and Analytics, BI strategy, as well as supervising the design, development, and maintenance of predictive models, BI tools, and systems. You will be accountable for ensuring data accuracy, integrity, and security, and providing actionable insights to support business decision-making. Collaboration with cross-functional teams to comprehend their data needs, managing and mentoring a team of data analytics professionals, setting performance goals, and conducting regular performance reviews are also part of your role. Staying updated with the latest trends and technologies, developing and maintaining data governance policies, and delivering presentations to senior management are critical responsibilities. Additionally, monitoring and reporting on key performance indicators (KPIs), identifying opportunities for process improvements, and ensuring compliance with data privacy regulations will be essential in this position.

To qualify for this role, you should possess a Bachelor's degree in Computer Science, Information Systems, or a related field, with a Master's degree being preferred. Prior experience in business intelligence/data analytics and a leadership role is required. Proficiency in R/Python, machine learning, databases, and dashboards, as well as a strong understanding of data science, machine learning, and data analytics, is necessary. Familiarity with BI tools and systems, excellent analytical and problem-solving skills, and strong communication and presentation abilities are also essential. Experience with data warehousing, ETL processes, SQL, and other database query languages is expected. The ability to work collaboratively with cross-functional teams, strong project management skills, and knowledge of data governance and data privacy regulations are crucial. Experience in banking and financial services, particularly in predictive modeling in the regulatory and non-regulatory credit risk domain, would be advantageous.

In conclusion, the VP Data Science role demands a dynamic individual who can effectively lead the Data and Analytics, BI function, drive strategic decision-making through data insights, and contribute significantly to the organization's success in achieving its business objectives.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
We are seeking an experienced software engineer to join our team at Grid Dynamics. The ideal candidate should be proficient in Java, the Spring Framework, and either Google Cloud Platform (GCP) or Azure. Hands-on experience with BigQuery, Apache Kafka, and GitHub/GitHub Actions is preferred, along with a strong background in developing RESTful APIs. If you are passionate about working with cutting-edge cloud technologies and building scalable solutions, we would love to connect with you!

The ideal candidate should have at least 6-9 years of experience in Java and extensive expertise in the Spring Boot framework. A strong background working with either MS Azure, Google Cloud Platform (GCP), or AWS is required, along with a solid understanding of data integration patterns and ETL processes. Experience with unit and integration testing (e.g., JUnit, Mockito) is essential for this role, along with knowledge of distributed systems architecture. Strong analytical and problem-solving skills are necessary to tackle complex challenges effectively. Immediate joiners are preferred, and the position can be based in Bangalore, Hyderabad, or Chennai.
Posted 1 month ago
1.0 - 3.0 years
0 Lacs
India
On-site
Qualification:
Education: Bachelor's degree in any field.
Experience: Minimum 1.5-2 years of experience in data engineering support or a related role, with hands-on exposure to AWS.

Technical Skills:
- Strong understanding of AWS services, including but not limited to S3, EC2, CloudWatch, and IAM.
- Proficiency in SQL with the ability to write, optimize, and debug queries for data analysis and issue resolution.
- Hands-on experience with Python for scripting and automation; familiarity with Shell scripting is a plus.
- Good understanding of ETL processes and data pipelines.
- Exposure to data warehousing concepts; experience with Amazon Redshift or similar platforms preferred.
- Working knowledge of orchestration tools, especially Apache Airflow, including monitoring and basic troubleshooting.

Soft Skills:
- Strong communication and interpersonal skills for effective collaboration with cross-functional and multi-cultural teams.
- Problem-solving attitude with an eagerness to learn and adapt quickly.
- Willingness to work in a 24x7 support environment on a 6-day working schedule, with rotational shifts as required.

Language Requirements: Must be able to read and write in English proficiently.
Posted 1 month ago
3.0 - 4.0 years
5 - 6 Lacs
Noida, Gurugram, Bengaluru
Work from Office
Senior Engineer: The TCA practice has experienced significant growth in demand for engineering and architecture roles from CST, driven by client needs that extend beyond traditional data and analytics architecture skills. There is an increasing emphasis on deep technical skills, such as strong expertise in Azure, Snowflake, Azure OpenAI, and Snowflake Cortex, along with a solid understanding of their respective functionalities. The individual will work on a robust pipeline of TCA-driven projects with pharma clients. This role offers significant opportunities for progression within the practice.

What You'll Do
- Opportunity to work on high-impact projects with leading clients.
- Exposure to complex and technological initiatives.
- Learning support through organization-sponsored trainings and certifications.
- Collaborative and growth-oriented team culture.
- Clear progression path within the practice.
- Opportunity to work on the latest technologies.
- Successful delivery of client projects and a continuous learning mindset, with certifications in newer areas.
- Contribution to partnering with project leads and AEEC leads to deliver complex projects and grow the TCA practice.
- Development of expert tech solutions for client needs, with positive feedback from clients and team members.

What You'll Bring
- 3-4 years of experience in RDF ontologies, RDF-based knowledge graphs (AnzoGraph DB preferred), data modeling, Azure cloud, and data engineering.
- Understanding of ETL processes, data pulls using Azure services via polling mechanisms, and API/middleware development using Azure services.
- Strong ability to identify data anomalies, design data validation rules, and perform data cleanup to ensure high-quality data.
- Experience in pharma or life sciences data: familiarity with pharmaceutical datasets, including product, patient, or healthcare provider data, is a plus.
Posted 1 month ago
12.0 - 16.0 years
10 - 14 Lacs
Pune
Work from Office
IT MANAGER, DATA ENGINEERING AND ANALYTICS will lead a team of data engineers and analysts responsible for designing, developing, and maintaining robust data systems and integrations. This role is critical for ensuring the smooth collection, transformation, integration, and visualization of data, making it easily accessible for analytics and decision-making across the organization. The Manager will collaborate closely with analysts, developers, business leaders, and other stakeholders to ensure that the data infrastructure meets business needs and is scalable, reliable, and efficient.

What You'll Do:

Team Leadership: Manage, mentor, and guide a team of data engineers and analysts, ensuring their professional development and optimizing team performance. Foster a culture of collaboration, accountability, and continuous learning within the team. Lead performance reviews, provide career guidance, and handle resource planning.

Data Engineering & Analytics: Design and implement data pipelines, data models, and architectures that are robust, scalable, and efficient. Develop and enforce data quality frameworks to ensure accuracy, consistency, and reliability of data assets. Establish and maintain data lineage processes to track the flow and transformation of data across systems. Ensure the design and maintenance of robust data warehousing solutions to support analytics and reporting needs.

Collaboration and Stakeholder Management: Collaborate with stakeholders, including functional owners, analysts, and business leaders, to understand business needs and translate them into technical requirements. Work closely with these stakeholders to ensure the data infrastructure supports organizational goals and provides reliable data for business decisions. Build and foster relationships with major stakeholders to ensure management perspectives on data strategy and its alignment with business objectives.

Project Management: Drive end-to-end delivery of analytics projects, ensuring quality and timeliness. Manage project roadmaps, prioritize tasks, and allocate resources effectively. Manage project timelines and mitigate risks to ensure timely delivery of high-quality data engineering projects.

Technology and Infrastructure: Evaluate and implement new tools, technologies, and best practices to improve the efficiency of data engineering processes. Oversee the design, development, and maintenance of data pipelines, ensuring that data is collected, cleaned, and stored efficiently. Ensure there are no data pipeline leaks and monitor production pipelines to maintain their integrity. Familiarity with reporting tools such as Superset and Tableau is beneficial for creating intuitive data visualizations and reports.

Machine Learning and GenAI Integration:
Machine Learning: Knowledge of machine learning concepts and integration with data pipelines is a plus. This includes understanding how machine learning models can be used to enhance data quality, predict data trends, and automate decision-making processes.
GenAI: Familiarity with Generative AI (GenAI) concepts and exposure is advantageous, particularly in enabling GenAI features on new datasets. Leveraging GenAI with data pipelines to automate tasks, streamline workflows, and uncover deeper insights is beneficial.

What You'll Bring:

12+ years of experience in data engineering, with at least 3 years in a managerial role.

Technical Expertise: Strong knowledge of data engineering concepts, including data warehousing, ETL processes, and data pipeline design. Proficiency in Azure Synapse or Data Factory, SQL, Python, and other data engineering tools.

Data Modeling: Expertise in data modeling is essential, with the ability to design and implement robust, scalable data models that support complex analytics and reporting needs. Experience with data modeling frameworks and tools is highly valued.

Leadership Skills: Proven ability to lead and motivate a team of engineers while managing cross-functional collaborations.

Problem-Solving: Strong analytical and troubleshooting skills to address complex data-related challenges.

Communication: Excellent verbal and written communication skills to effectively interact with technical and non-technical stakeholders. This includes the ability to motivate team members, provide regular constructive feedback, and facilitate open communication channels to ensure team alignment and success.

Data Architecture: Experience with designing scalable, high-performance data systems and understanding of cloud platforms such as Azure and Databricks.

Machine Learning and GenAI: Knowledge of machine learning concepts and integration with data pipelines, as well as familiarity with GenAI, is a plus.

Data Governance: Experience with data governance best practices is desirable.

Open Mindset: An open mindset with a willingness to learn new technologies, processes, and methodologies is essential. The ability to adapt quickly to evolving data engineering landscapes and embrace innovative solutions is highly valued.
Posted 1 month ago
6.0 - 9.0 years
6 - 186 Lacs
Noida, Uttar Pradesh, India
On-site
Description
We are seeking a skilled SAP Native HANA developer to join our team in India. The ideal candidate will have a strong background in SAP HANA development and be responsible for creating and optimizing applications within the SAP HANA platform.

Responsibilities
- Develop and implement SAP HANA Native applications
- Optimize database performance and ensure high availability
- Collaborate with cross-functional teams to gather requirements and develop solutions
- Design data models and prepare data for analysis
- Provide technical support and troubleshooting for SAP HANA environments
- Stay updated with the latest SAP technologies and best practices

Skills and Qualifications
- 6-9 years of experience in SAP HANA Native development
- Strong knowledge of SQL and database management
- Experience with data modeling and ETL processes
- Familiarity with SAP HANA Studio and SAP Business Objects
- Understanding of application development using XSJS and SQLScript
- Proficiency in performance tuning and optimization techniques
- Excellent problem-solving skills and attention to detail
Posted 1 month ago