2.0 - 6.0 years
6 - 9 Lacs
Mumbai, Mumbai Suburban, Mumbai (All Areas)
Work from Office
We are seeking a skilled SQL + Python Developer with a minimum of 3 years of experience to join our dynamic team. This role involves a mix of database development, administration, and data engineering tasks, including designing and implementing ETL processes for data integration. Required Candidate profile: The ideal candidate will have a strong background in SQL, PL/SQL, and Python scripting, with proven expertise in SQL query tuning, database performance optimization, and the Snowflake Data Warehouse. Perks and benefits: to be disclosed post interview.
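For a concrete sense of the day-to-day work this posting describes, below is a minimal Python ETL sketch; the connection URLs, table names, and cleanup rules are illustrative assumptions, not details from the posting.

```python
# Minimal ETL sketch: extract recent rows from a source database, clean them,
# and load them into a warehouse table. The connection URLs and the
# orders/orders_clean table names are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("oracle+oracledb://user:pwd@db-host:1521/?service_name=ORCL")
target = create_engine("snowflake://user:pwd@account/db/schema?warehouse=wh")

# Extract: restrict to yesterday's rows so each run stays cheap.
df = pd.read_sql(
    "SELECT order_id, amount, status, updated_at "
    "FROM orders WHERE updated_at >= TRUNC(SYSDATE) - 1",
    source,
)

# Transform: normalize status values and drop malformed rows.
df["status"] = df["status"].str.strip().str.upper()
df = df.dropna(subset=["order_id", "amount"])

# Load: append into a warehouse staging table.
df.to_sql("orders_clean", target, if_exists="append", index=False)
```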
Posted 1 month ago
1.0 - 3.0 years
1 - 5 Lacs
Chennai
Work from Office
hackajob is collaborating with Comcast to connect them with exceptional tech professionals for this role. Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary: We are looking for an experienced and proactive ETL Lead to oversee and guide our ETL testing and data validation efforts. This role requires a deep understanding of ETL processes, strong technical expertise in tools such as SQL, Oracle, MongoDB, AWS, and Python/PySpark, and proven leadership capabilities. The ETL Lead will be responsible for ensuring the quality, accuracy, and performance of our data pipelines while mentoring a team of testers and collaborating with cross-functional stakeholders.

Key Responsibilities:
- Lead the planning, design, and execution of ETL testing strategies across multiple projects.
- Oversee the development and maintenance of test plans, test cases, and test data for ETL processes.
- Ensure data integrity, consistency, and accuracy across all data sources and destinations.
- Collaborate with data engineers, developers, business analysts, and project managers to define ETL requirements and testing scope.
- Mentor and guide a team of ETL testers, providing technical direction and support.
- Review and approve test deliverables and ensure adherence to best practices and quality standards.
- Identify and resolve complex data issues, bottlenecks, and performance challenges.
- Drive continuous improvement in ETL testing processes, tools, and methodologies.
- Provide regular status updates, test metrics, and risk assessments to stakeholders.
- Stay current with emerging trends and technologies in data engineering and ETL testing.

Requirements:
- 6+ years of experience in ETL testing, with at least 2 years in a lead or senior role.
- Strong expertise in ETL concepts, data warehousing, and data validation techniques.
- Hands-on experience with Oracle, MongoDB, AWS services (e.g., S3, Redshift, Glue), and Python/PySpark scripting.
- Advanced proficiency in SQL and other query languages.
- Proven ability to lead and mentor a team of testers.
- Excellent problem-solving, analytical, and debugging skills.
- Strong communication and stakeholder management abilities.
- Experience with Agile/Scrum methodologies is a plus.
- Ability to manage multiple priorities and deliver high-quality results under tight deadlines.

Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.

Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools, that are personalized to meet the needs of your reality, to help support you physically, financially and emotionally through the big milestones and in your everyday life.

Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience: 7-10 Years.
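To make the ETL-testing duties above concrete, here is a minimal Python sketch of one routine validation, reconciling a row count and an aggregate between source and target; the connection URLs and the customers table are assumptions for illustration.

```python
# Sketch of one common ETL validation: compare a row count and a column
# aggregate across source and target systems. Connection URLs and the
# customers table name are hypothetical placeholders.
import sqlalchemy as sa

src = sa.create_engine("oracle+oracledb://user:pwd@src-host:1521/?service_name=SRC")
tgt = sa.create_engine("postgresql+psycopg2://user:pwd@tgt-host:5432/dwh")

CHECKS = [
    ("row count", "SELECT COUNT(*) FROM customers"),
    ("amount sum", "SELECT COALESCE(SUM(total_amount), 0) FROM customers"),
]

for name, query in CHECKS:
    with src.connect() as s, tgt.connect() as t:
        src_val = s.execute(sa.text(query)).scalar()
        tgt_val = t.execute(sa.text(query)).scalar()
    status = "PASS" if src_val == tgt_val else "FAIL"
    print(f"{status}: {name} (source={src_val}, target={tgt_val})")
```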
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Job Description: Tietoevry Create is seeking a Senior Analytics Engineer to join our dynamic team in Bengaluru, India. In this role, you will lead the development and implementation of advanced analytics solutions, driving data-driven decision-making across the organization.
- Design and develop scalable, efficient, and reliable data pipelines and ETL processes.
- Collaborate with cross-functional teams to identify and implement analytics solutions that address business needs.
- Lead the architecture and implementation of data models and analytics platforms.
- Mentor junior team members and promote best practices in analytics engineering.
- Optimize data retrieval and processing to improve system performance.
- Implement and maintain data quality controls and governance processes.
- Stay up-to-date with emerging technologies and industry trends in data analytics.
- Contribute to the development of data strategies and roadmaps.
- Provide technical expertise and support for complex analytics projects.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- 5+ years of experience in analytics engineering or a related field.
- Advanced knowledge of data analytics tools and platforms.
- Proficiency in SQL, Python, and/or R.
- Experience with big data technologies such as Hadoop and Spark.
- Strong understanding of data modeling and ETL processes.
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Knowledge of machine learning algorithms and statistical analysis.
- Proven track record of leading analytics projects and mentoring junior team members.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Understanding of business intelligence and data-driven decision making.
- Knowledge of data governance and privacy regulations.
- Relevant certifications (e.g., AWS Certified Data Analytics, Google Cloud Professional Data Engineer) are a plus.

Additional Information: At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity. Diversity, equity and inclusion (tietoevry.com).
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Experience: 3+ years. As a Senior Data Engineer, you'll build robust data pipelines and enable data-driven decisions by developing scalable solutions for analytics and reporting. Perfect for someone with strong database and ETL expertise.

Job Responsibilities:
- Design, build, and maintain scalable data pipelines and ETL processes.
- Work with large data sets from diverse sources.
- Develop and optimize data models, warehouses, and integrations.
- Collaborate with data scientists, analysts, and product teams.
- Ensure data quality, security, and compliance standards.

Qualifications:
- Proficiency in SQL, Python, and data pipeline tools (Airflow, Spark).
- Experience with data warehouses (Redshift, Snowflake, BigQuery).
- Knowledge of cloud platforms (AWS/GCP/Azure).
- Strong problem-solving and analytical skills.
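Since the posting names Airflow as a pipeline tool, a minimal DAG sketch follows (Airflow 2.x assumed); the DAG id, schedule, and task names are placeholders, not project specifics.

```python
# Minimal Airflow 2.x DAG sketch: a daily extract -> transform -> load chain.
# Task bodies are stubs; the DAG id, schedule, and task names are illustrative.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # pull raw data from the source system

def transform():
    ...  # clean and reshape the extracted data

def load():
    ...  # write the result to the warehouse

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load
```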
Posted 1 month ago
5.0 - 8.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Date: 17 Jun 2025. Location: Bangalore, IN. Company: Alstom. Req ID: 486332

STRUCTURE, REPORTING, NETWORKS & LINKS
Organization Structure:
CITO
|-- Data & AI Governance Vice President
|-- Enterprise Data Domain Director
|-- Head of Analytics Platform
|-- Analytics Delivery Architect
|-- Analytics Technical Specialist
Organizational Reporting: Reports to Delivery Manager.
Networks & Links: Internally, Transversal Digital Platforms Team, Innovation Team, Application Platform Owners, Business process owners, Infrastructure team; externally, third-party technology providers and strategic partners.
Location: Position will be based in Bangalore. Willing to travel occasionally for onsite meetings and team workshops as required.

RESPONSIBILITIES
- Design, develop, and deploy interactive dashboards and reports using MS Fabric & Qlik Cloud, ensuring alignment with business requirements and goals.
- Implement and manage data integration workflows utilizing MS Fabric to ensure efficient data processing and accessibility.
- Translate business needs into technical specifications, and design, build, and deploy solutions.
- Understand and integrate Power BI reports into other applications using embedded analytics such as the Power BI service (SaaS), Teams, SharePoint, or API automation.
- Be responsible for access management of app workspaces and content.
- Integrate Power BI servers with different data sources and handle timely upgrades/servicing of Power BI.
- Schedule and refresh jobs on the Power BI on-premises data gateway.
- Configure standard system reports, as well as customized reports as required.
- Help with various kinds of database connections (SQL, Oracle, Excel, etc.) to Power BI Services.
- Investigate and troubleshoot reporting issues and problems.
- Maintain the reporting schedule and document reporting procedures.
- Monitor and troubleshoot data flow issues, optimizing the performance of MS Fabric applications as needed.
- Optimize application performance and data models in Qlik Cloud while ensuring data accuracy and integrity.
- Ensure collaboration with Functional & Technical Architects as business cases are set up for each initiative; collaborate with other analytics teams to drive and operationalize analytical deployment.
- Maintain clear and coherent communication, both verbal and written, to understand data needs and report results.
- Ensure compliance with internal policies and regulations.
- Show a strong ability to take the lead and be autonomous, with proven planning, prioritization, and organizational skills, and the ability to drive change through innovation and process improvement.
- Report to management and stakeholders in a clear and concise manner.
- Good to have: contribution to the integration and utilization of Denodo for data virtualization, enhancing data access across multiple sources; document Denodo processes, including data sources and transformations, to support knowledge sharing within the team.
- Facilitate effective communication with stakeholders regarding project updates, risks, and resolutions to ensure transparency and alignment.
- Participate in team meetings and contribute innovative ideas to improve reporting and analytics solutions.

EDUCATION
Bachelor's/Master's degree in Computer Science Engineering/Technology or a related field.

Experience
- Total 6 years of experience.
- Mandatory 2+ years of experience in Power BI end-to-end development using Power BI Desktop, connecting multiple data sources (SAP, SQL, Azure, REST APIs, etc.).
- Experience in MS Fabric components along with Denodo.

Technical competencies:
- Proficient in using MS Fabric for data integration and automation of ETL processes.
- Understanding of data governance principles for quality and security.
- Strong expertise in creating dashboards and reports using Power BI and Qlik.
- Knowledge of data modeling concepts in Qlik and Power BI.
- Proficient in writing complex SQL queries for data extraction and analysis.
- Skilled in utilizing analytical functions in Power BI and Qlik.
- Experience in troubleshooting performance issues in MS Fabric and Denodo.
- Experience in developing visual reports, dashboards, and KPI scorecards using Power BI Desktop & Qlik.
- Understands the Power BI application security layer model.
- Hands-on with PowerPivot, role-based data security, Power Query, DAX queries, Excel, pivots/charts/grids, and Power View.
- Good to have: Power BI Services and administration knowledge.
- Experience in developing data models using Denodo to support business intelligence and analytics needs.
- Proficient in creating base views and derived views for effective data representation.
- Ability to implement data transformations and enrichment within Denodo.
- Skilled in using Denodo's SQL capabilities to write complex queries for data retrieval.
- Familiarity with integrating Denodo with various data sources, such as databases, web services, and big data platforms.

BEHAVIORAL COMPETENCIES
The candidate should demonstrate:
- A strong sense of collaboration and being a team player.
- The ability to articulate issues and propose solutions.
- A structured thought process and articulation.
- Critical thinking and problem-solving skills.
- An analytical bent of mind and willingness to question the status quo.
- Excellent soft skills.
- The ability to work as an individual contributor while being proactive and showing leadership skills; able to guide and drive the team from a technical standpoint.
- Excellent written, verbal, and interpersonal skills.
- Self-motivation; being a quick learner is a must.
- Fluency in English.
- The ability to influence and deliver.

You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you!

Important to note: As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
Posted 1 month ago
4.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Date: 19 Jun 2025. Location: Bangalore, IN. Company: Alstom

At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams, to turnkey systems, services, infrastructure, signalling and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars.

OVERALL PURPOSE OF THE ROLE
He/She will act as an Anaplan expert and ensure the platform is suitable for user adoption per the business requirements and compliant with Alstom security standards at all times. He/She will play a crucial role in implementing an architecture that is optimized for performance and storage, and is expected to lead and coordinate end-to-end delivery on projects and demands. In addition, he/she will be responsible for tracking users and managing licenses to ensure compliance with the contractual objectives.

STRUCTURE, REPORTING, NETWORKS & LINKS
Organization Structure:
CITO
|-- VP Data & AI Governance
|-- Enterprise Data Domain Director
|-- Head of Analytics Platform
|-- Analytics Delivery Architect
|-- Analytics Technical Analyst
Organizational Reporting: Reports to Head of Analytics Platform.
Networks & Links: Internally, Digital Platforms Team, Innovation Team, Application Platform Owners, Business process owners, Infrastructure team; externally, third-party technology providers and strategic partners.
Location: Position will be based in Bangalore.

RESPONSIBILITIES
- Design, develop, and deploy interactive dashboards and reports using MS Fabric & Qlik Cloud (good to have), ensuring alignment with business requirements and goals.
- Implement and manage data integration workflows utilizing MS Fabric to ensure efficient data processing and accessibility.
- Use Python scripts to automate data cleaning and preprocessing tasks for data models.
- Understand and integrate Power BI reports into other applications using embedded analytics such as the Power BI service (SaaS), Teams, SharePoint, or API automation.
- Be responsible for access management of app workspaces and content.
- Integrate Power BI servers with different data sources and handle timely upgrades/servicing of Power BI.
- Schedule and refresh jobs on the Power BI on-premises data gateway.
- Configure standard system reports, as well as customized reports as required.
- Help with various kinds of database connections (SQL, Oracle, Excel, etc.) to Power BI Services.
- Investigate and troubleshoot reporting issues and problems.
- Maintain the reporting schedule and document reporting procedures.
- Monitor and troubleshoot data flow issues, optimizing the performance of MS Fabric applications as needed.
- Ensure collaboration with Functional & Technical Architects as business cases are set up for each initiative; collaborate with other analytics teams to drive and operationalize analytical deployment.
- Maintain clear and coherent communication, both verbal and written, to understand data needs and report results.
- Ensure compliance with internal policies and regulations.
- Show a strong ability to take the lead and be autonomous, with proven planning, prioritization, and organizational skills, and the ability to drive change through innovation and process improvement.
- Report to management and stakeholders in a clear and concise manner.
- Good to have: contribution to the integration and utilization of Denodo for data virtualization, enhancing data access across multiple sources.
- Facilitate effective communication with stakeholders regarding project updates, risks, and resolutions to ensure transparency and alignment.
- Participate in team meetings and contribute innovative ideas to improve reporting and analytics solutions.

EDUCATION
Bachelor's/Master's degree in Computer Science Engineering/Technology or a related field.

Experience
- Minimum 3 and maximum 5 years of total experience.
- Mandatory 2+ years of experience in MS Fabric and Power BI end-to-end development using Power BI Desktop, connecting multiple data sources (SAP, SQL, Azure, REST APIs, etc.).
- Hands-on experience in Python, R, and SQL for data manipulation, analysis, data pipelines, and database interaction.
- Experience or knowledge in using PySpark or Jupyter Notebook for data cleaning, transformation, exploration, visualization, and building data models on large datasets.

Technical competencies:
- Proficient in using MS Fabric for data integration and automation of ETL processes.
- Knowledge of Python modules for data modeling used alongside PySpark (NumPy, pandas).
- Hands-on with the Python and R programming languages for data processing.
- Understanding of data governance principles for quality and security.
- Strong expertise in creating dashboards and reports using Power BI and Qlik.
- Knowledge of data modeling concepts in Qlik and Power BI.
- Proficient in writing complex SQL queries for data extraction and analysis.
- Skilled in utilizing analytical functions in Power BI and Qlik.
- Experience in developing visual reports, dashboards, and KPI scorecards using Power BI Desktop & Qlik.
- Hands-on with PowerPivot, role-based data security, Power Query, DAX queries, Excel, pivots/charts/grids, and Power View.
- Good to have: Power BI Services and administration knowledge.

Important to note: As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
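The role above calls for Python scripts that automate data cleaning for BI data models; the following pandas sketch shows that kind of task, with the file and column names assumed for illustration.

```python
# Pandas sketch of an automated cleaning step feeding a BI model; the CSV
# path and column names are assumed, not taken from the posting.
import pandas as pd

df = pd.read_csv("sales_extract.csv")

# Standardize keys, coerce types, and handle missing values before the
# data is loaded into a Power BI / Fabric semantic model.
df["region"] = df["region"].str.strip().str.title()
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0.0)
df = df.drop_duplicates(subset=["order_id"])

# Parquet keeps column types intact downstream (pyarrow assumed installed).
df.to_parquet("sales_clean.parquet", index=False)
```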
Posted 1 month ago
2.0 - 6.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Job Purpose: The Data Analyst plays a crucial lead role in managing and optimizing business intelligence solutions using Power BI.
- Leadership and Strategy: Lead the design, development, and deployment of Power BI reports and dashboards. Provide strategic direction for data visualization and business intelligence initiatives. Interface with the Business Owner, Project Manager, Planning Manager, Resource Managers, etc. Develop a roadmap for execution of complex data analytics projects.
- Data Modeling and Integration: Develop complex data models, establish relationships, and ensure data integrity. Oversee data integration from various sources.
- Advanced Analytics: Perform advanced data analysis using DAX (Data Analysis Expressions) and other analytical tools to derive insights and support decision-making.
- Collaboration: Work closely with stakeholders to gather requirements, define data needs, and ensure the delivery of high-quality BI solutions.
- Performance Optimization: Optimize solutions for performance, ensuring efficient data processing and report rendering.
- Mentorship: Mentor and guide junior developers, providing technical support and best practices for Power BI development.
- Data Security: Implement and maintain data security measures, ensuring compliance with data protection regulations.
- Demonstrated experience of leading complex projects with a team of varied experience levels.

You are meant for this job if:
- Educational Background: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience in working with unstructured data and data integration.
- Technical Skills: Proficiency in Power BI, DAX, SQL, and data modeling, with exposure to data engineering. Experience with data integration tools and ETL processes. Hands-on experience with Snowflake.
- Experience: 7-8 years of experience in business intelligence and data analytics, with a focus on Power BI.
- Soft Skills: Strong analytical and problem-solving skills, excellent communication abilities, and the capacity to lead and collaborate with global cross-functional teams.

Skills: Change Leadership, Process Mapping
Posted 1 month ago
5.0 - 8.0 years
10 - 20 Lacs
Chennai, Bengaluru
Work from Office
ODI Developer, Chennai/Bangalore. WFO only. 5-8 years of experience as an ETL Developer, with hands-on expertise in Oracle Data Integrator (ODI). ODI expertise is a must; profiles with ETL experience only in Informatica or other tools, without ODI, will be rejected. Proficiency in Oracle Database and MySQL, with strong skills in SQL and PL/SQL development. Experience in data integration, transformation, and loading from heterogeneous data sources. Strong understanding of data modeling concepts and ETL best practices. Familiarity with performance tuning and troubleshooting of ETL processes. Knowledge of scripting languages (e.g., Python, JavaScript) for automation is a plus. Excellent analytical and problem-solving skills. Strong communication skills to work effectively with cross-functional teams. Please call Varsha (7200847046) for more info.
Posted 1 month ago
5.0 - 10.0 years
7 - 17 Lacs
Chennai, Bengaluru
Work from Office
5+ years of experience as an ETL Developer, with hands-on expertise in Oracle Data Integrator (ODI). Proficiency in Oracle Database and MySQL, with strong skills in SQL and PL/SQL. Experience in data integration, transformation, and loading from heterogeneous data sources.
Posted 1 month ago
4.0 - 6.0 years
4 - 9 Lacs
Gurugram
Work from Office
As a key member of the DTS team, you will primarily collaborate closely with a leading global hedge fund on data engagements, with a foundation in building modern, responsive web applications using Blazor and MudBlazor. In this role, you will be instrumental in shaping the user experience of our applications, working closely with cross-functional teams to deliver high-quality, scalable, and maintainable UI components.

Desired Skills and Experience
Essential skills:
- Bachelor's or master's degree in computer science, engineering, or a related field.
- 4-6 years of experience in data engineering, with a strong background in building and maintaining data pipelines and ETL processes.
- Proven experience with Blazor and MudBlazor, or a strong willingness to learn.
- Solid understanding of modern JavaScript frameworks (e.g., React, Angular, Vue).
- Strong grasp of HTML, CSS, and responsive design principles.
- Experience working in collaborative, agile development environments.
- Familiarity with accessibility standards and frontend performance optimization.
- Experience with Razor components and .NET backend integration.
- Exposure to unit testing and automated UI testing tools.

Key Responsibilities:
- Build and maintain responsive, reusable UI components in Blazor and MudBlazor.
- Translate UI/UX mockups and business requirements into functional frontend features.
- Work closely with backend engineers to ensure smooth API integrations.
- Participate in code reviews and collaborate on frontend design patterns and best practices.
- Investigate and resolve UI bugs and performance issues.
- Contribute to maintaining consistency, accessibility, and scalability of the frontend codebase.
- Collaborate with QA and DevOps to support testing and deployment pipelines.
- Stay current with frontend trends and technologies, particularly in the .NET and Blazor ecosystems.

Our current stack includes C#, .NET 5+, Blazor, and the MudBlazor component library. We welcome developers with strong experience in modern JavaScript frameworks (such as React, Angular, or Vue) and a willingness to quickly learn Blazor and Razor components.

Key Metrics: C#, .NET 5+, Blazor; UI library: MudBlazor; Git, CI/CD, Agile/Scrum.
Behavioral Competencies: Good communication (verbal and written); experience in managing client stakeholders.
Posted 1 month ago
6.0 - 8.0 years
12 - 17 Lacs
Gurugram
Work from Office
As a key member of the DTS team, you will primarily collaborate closely with a leading global hedge fund on data engagements, partnering with the data strategy and sourcing team on data requirements to design data pipelines and delivery structures.

Desired Skills and Experience
Essential skills:
- A bachelor's degree in computer science, engineering, mathematics, or statistics.
- 6-8 years of experience in a Data Engineering role, with a proven track record of delivering insightful, value-add dashboards.
- Experience writing advanced SQL queries and Python, and a deep understanding of relational databases.
- Experience working within an Azure environment.
- Experience with Tableau; Holland Mountain ATLAS is a plus.
- Experience with master data management and data governance is a plus.
- Ability to prioritize multiple projects simultaneously, problem solve, and think outside the box.

Key Responsibilities:
- Develop, test, and release data packages for Tableau dashboards to support all business functions, including investments, investor relations, marketing, and operations.
- Support ad hoc requests, including the ability to write queries and extract data from a data warehouse.
- Assist with the management and maintenance of an Azure environment.
- Maintain a data dictionary, including documentation of database structures, ETL processes, and reporting dependencies.

Key Metrics: Python, SQL; Data Engineering, Azure, and ATLAS.
Behavioral Competencies: Good communication (verbal and written); experience in managing client stakeholders.
Posted 1 month ago
6.0 - 8.0 years
7 - 16 Lacs
Pune
Work from Office
Proficient in SQL and data warehousing. Good understanding of ETL processes, data governance, data quality frameworks, and data control frameworks. Understanding of the Legal, Risk, and Compliance domains, or at least strong in one of them. Attention to data detail and accuracy.
Posted 1 month ago
5.0 - 9.0 years
4 - 7 Lacs
Gurugram
Work from Office
Primary Skills:
- SQL (advanced level)
- SSAS (SQL Server Analysis Services), multidimensional and/or tabular models
- MDX / DAX (strong querying capabilities)
- Data modeling (star schema, snowflake schema)

Secondary Skills:
- ETL processes (SSIS or similar tools)
- Power BI / reporting tools
- Azure Data Services (optional but a plus)

Role & Responsibilities:
- Design, develop, and deploy SSAS models (both tabular and multidimensional).
- Write and optimize MDX/DAX queries for complex business logic.
- Work closely with business analysts and stakeholders to translate requirements into robust data models.
- Design and implement ETL pipelines for data integration.
- Build reporting datasets and support BI teams in developing insightful dashboards (Power BI preferred).
- Optimize existing cubes and data models for performance and scalability.
- Ensure data quality, consistency, and governance standards.

Top Skill Set: SSAS (tabular + multidimensional modeling); strong MDX and/or DAX query writing; advanced SQL for data extraction and transformations; data modeling concepts (fact/dimension, slowly changing dimensions, etc.); ETL tools (SSIS preferred); Power BI or similar BI tools; understanding of OLAP & OLTP concepts; performance tuning (SSAS/SQL).

Skills: analytical skills, ETL processes (SSIS or similar tools), collaboration, MDX, DAX, Power BI / reporting tools, SQL (advanced level), SSAS (multidimensional and tabular models), data modeling (star schema, snowflake schema), communication, Azure Data Services, data visualization
Posted 1 month ago
8.0 - 12.0 years
11 - 18 Lacs
Faridabad
Remote
We are seeking an experienced and highly skilled Senior Data Scientist to drive data-driven decision-making and innovation. In this role, you will leverage your expertise in advanced analytics, machine learning, and big data technologies to solve complex business challenges. You will be responsible for designing predictive models, building scalable data pipelines, and uncovering actionable insights from structured and unstructured datasets. Collaborating with cross-functional teams, your work will empower strategic decision-making and foster a data-driven culture across the organization.

Role & responsibilities:
1. Data Exploration and Analysis: Collect, clean, and preprocess large and complex datasets from diverse sources, including SQL databases, cloud platforms, and APIs. Perform exploratory data analysis (EDA) to identify trends, patterns, and relationships in data. Develop meaningful KPIs and metrics tailored to business objectives.
2. Advanced Modeling and Machine Learning: Design, implement, and optimize predictive and prescriptive models using statistical techniques and machine learning algorithms. Evaluate model performance and ensure scalability and reliability in production. Work with both structured and unstructured data for tasks such as text analysis, image processing, and recommendation systems.
3. Data Engineering and Automation: Build and optimize scalable ETL pipelines for data processing and feature engineering. Collaborate with data engineers to ensure seamless integration of data science solutions into production environments. Leverage cloud platforms (e.g., AWS, Azure, GCP) for scalable computation and storage.
4. Data Visualization and Storytelling: Communicate complex analytical findings effectively through intuitive visualizations and presentations. Create dashboards and visualizations using tools such as Power BI, Tableau, or Python libraries (e.g., Matplotlib, Seaborn, Plotly). Translate data insights into actionable recommendations for stakeholders.
5. Cross-functional Collaboration and Innovation: Partner with business units, product teams, and data engineers to define project objectives and deliver impactful solutions. Stay updated with emerging technologies and best practices in data science, machine learning, and AI. Contribute to fostering a data-centric culture within the organization by mentoring junior team members and promoting innovative approaches.

Preferred candidate profile:
- Proficiency in Python, R, or other data science programming languages.
- Strong knowledge of machine learning libraries and frameworks (e.g., scikit-learn, TensorFlow, PyTorch).
- Advanced SQL skills for querying and managing relational databases.
- Experience with big data technologies (e.g., Spark, Hadoop) and cloud platforms (AWS, Azure, GCP), preferably MS Azure.
- Familiarity with data visualization tools such as Power BI, Tableau, or equivalent, preferably MS Power BI.
- Analytical and problem-solving skills: expertise in statistical modeling, hypothesis testing, and experiment design; strong problem-solving skills to address business challenges through data-driven solutions; ability to conceptualize and implement metrics/KPIs tailored to business needs.
- Soft skills: excellent communication skills to translate complex technical concepts into business insights; a collaborative mindset with the ability to work in cross-functional teams; a proactive and detail-oriented approach to project management and execution.

Education and Experience: Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field. 8+ years of experience in data science, advanced analytics, or a similar field. Proven track record of deploying machine learning models in production environments.

Perks & Benefits: Best as per market standard. Work-from-home opportunity. 5 days working. Shift timing: 2 PM-11 PM IST (flexible hours).
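To illustrate the predictive-modeling responsibilities described above, here is a minimal scikit-learn pipeline sketch; the CSV path, feature names, and target column are placeholder assumptions.

```python
# Minimal predictive-modeling sketch: preprocess, train, and evaluate a
# classifier. The dataset, features, and target are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("customers.csv")
X = df[["tenure_months", "monthly_spend", "support_tickets"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# A single pipeline keeps scaling and the model together, so the same
# preprocessing is applied at training and prediction time.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=42)),
])
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Holdout ROC-AUC: {auc:.3f}")
```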
Posted 1 month ago
4.0 - 9.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Date: 28 May 2025. Location: Bangalore, KA, IN. Company: Alstom. Req ID: 486780

We create smart innovations to meet the mobility challenges of today and tomorrow. We design, manufacture and support a complete range of transportation systems, from high-speed trains to electric buses and driverless trains, as well as infrastructure, signalling and digital mobility solutions. Joining us means joining a truly global community of more than 75,000 people dedicated to solving real-world mobility challenges and achieving international projects with sustainable local impact.

OVERALL PURPOSE OF THE ROLE
Alstom Services offers a wide range of business solutions for Rolling Stock (train, metro, tramway), Signalling and Infrastructure products: maintenance, modernization, technical support and spare parts services. The Services Business Solutions department, part of the Digital Transformation organization, is responsible for defining the architecture and for designing, building and operating the IT solutions supporting the Alstom Services business processes. Alstom provides several solutions to support Services business activities based on Salesforce products. The role of the Technical Analyst of Salesforce Services Solutions is to contribute to the design, to control the development and technical architecture of these applications, making sure the solutions follow the defined core model, and to support the rollout on Alstom Services projects worldwide. As a Technical Analyst you will work closely with the Salesforce functional expert and technical expert, and interact with other IS&T departments such as ERP, Engineering and Digital to ensure the end-to-end consistency of the IS&T landscape for the Customer Services domain.

RESPONSIBILITIES
As a Salesforce Services Technical Analyst, you are in charge of:
- Managing and implementing the technical design of Salesforce Service solutions in accordance with business strategy, working closely with the Technical & Functional Expert.
- Ensuring that the Salesforce Services Solutions Technical Core Model (IT solutions) is well documented based on business processes and rules, evolves consistently, and is not jeopardized by localization.
- Presenting analysis and detailed solution documentation to both technical and non-technical audiences.
- Leading development, package & release management activities and supporting the development team (including reviewing deliverables and effort estimations for multiple design options).
- Contributing to the delivery of Salesforce Services Core Model evolution in accordance with business strategy.
- Supporting application roll-outs, ensuring that the Core Model is applied.
- Interacting with Operations teams by monitoring their activities and providing technical expertise when needed.

To be able to execute all the responsibilities above, you will be working with internal and external IS&T partners. In addition, you will be constantly liaising between partners and the business to contribute to project successes. Your technical expertise will be needed to deliver the different projects with high quality and on time, and to efficiently support our critical applications.

BEHAVIORAL COMPETENCIES
You are an action-oriented person with strong analytical and problem-solving skills. You are a self-starter and results-oriented person. Excellent written, verbal and interpersonal skills are a must because you will need to work autonomously in a worldwide and multicultural environment. You should be agile enough to handle multiple tasks efficiently, keeping the big picture in mind and effectively delivering outcomes in a fast-paced environment.

TECHNICAL COMPETENCIES & EXPERIENCE
- Language skills: English expertise is a must, both spoken and written.
- Graduated with an engineering degree, preferably in information technology.
- 4 years of experience on Salesforce, including Salesforce Lightning, Service Cloud, Experience Cloud, Commerce Cloud.
- 2+ years of experience and proficiency with the Salesforce developer toolkit, including Apex, test classes, triggers, Visualforce, JavaScript, SOQL/SOSL and the Release Migration Tool.
- 1+ years of experience with Salesforce architecture, design and integration technologies (including Platform Events, Connected Apps, Mobile SDK, SSO, OAuth, SOAP, REST and SOA design principles).
- Knowledge of data management, ETL concepts, enterprise software development, and object-based and relational database technologies is an added advantage.
- Must have completed Salesforce certifications (Administrator, Platform Builder 1, Platform Developer 2); certification as a Salesforce Service Cloud Consultant is an added advantage.
- A strong knowledge of IT technologies supporting Customer Services, and experience implementing them, is preferred.

An agile, inclusive and responsible culture is the foundation of our company, where diverse people are offered excellent opportunities to grow, learn and advance in their careers. We are committed to encouraging our employees to reach their full potential, while valuing and respecting them as individuals.
Posted 1 month ago
6.0 - 7.0 years
9 - 10 Lacs
Gurugram
Work from Office
Seeking a data professional with advanced SQL skills, BI reporting experience (Power BI, Tableau, etc.), strong data modeling, ETL, and cloud exposure. Must thrive in Agile teams with strong problem-solving and communication abilities.
Posted 1 month ago
8.0 - 13.0 years
20 - 35 Lacs
Mumbai
Remote
Interview Process: 1 virtual technical round, 1 face-to-face round (Mumbai), 1 HR round

Position: Associate
Work Mode: 100% remote
Work Hours: Preferred overlap with New York, 6 PM to 3 AM IST
Holidays: Follows US holidays
Experience: Min 8 years
Start Date: ASAP

Key Technical Requirements:
- Python: Experience in web applications; Python + SQL with an ETL background; familiarity with Pandas and NumPy; able to differentiate where to use Python and where to use SQL.
- SQL: Performance optimization and best practices; data storage and retrieval strategies; writing stored procedures and functions; understanding of views, indexes, stored procedures and functions; ability to optimize logic and analyze implemented solutions.
- ETL & Data Warehousing: Strong understanding of data warehousing concepts; ability to handle large datasets efficiently.

Other details: Individual contributor role; ability to work independently; ability to gather requirements and interact with stakeholders.

Team & Reporting Structure: The hired candidate will report to a US-based manager and will be the sole developer for the role, as part of a global team of 12 (including 3 in India).

Interview Process: Round 1: technical assessment. Round 2: advanced technical evaluation.

Key Responsibilities:
- Work with development teams and product managers to design and implement software solutions.
- Develop, test, and maintain backend code using Python.
- Develop and manage well-functioning databases and applications.
- Create and manage ETL processes to ensure efficient data integration and transformation.
- Develop and maintain APIs to facilitate communication between different systems.
- Collaborate with cross-functional teams to design and implement scalable backend solutions.
- Optimize and troubleshoot backend code to ensure high performance and reliability.
- Participate in code reviews and provide constructive feedback to peers.
- Stay updated with the latest industry trends and technologies to ensure best practices.

Qualifications:
- Proven experience as a Backend Developer with expertise in Python.
- Strong proficiency in database development with SQL (NoSQL a plus).
- Experience with ETL processes and tools.
- Expertise in developing and maintaining APIs.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently and as part of a team.
- Strong communication skills and the ability to collaborate effectively with stakeholders.

Preferred Qualifications:
- Experience with cloud platforms, preferably Azure; Databricks experience a plus.
- Knowledge of data warehousing and big data technologies.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
- Experience working with investment teams, particularly structured credit.
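On the posting's "differentiate where to use Python and SQL" requirement, a common rule of thumb is to push set-based aggregation into the database and keep only logic SQL expresses poorly in pandas. A hedged sketch follows; the connection string, tables, and columns are assumptions.

```python
# Sketch contrasting the two tools the posting pairs: push set-based
# aggregation down to SQL Server, keep statistical/windowed logic in pandas.
# The connection string, trades table, and column names are assumptions.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://user:pwd@server/db?driver=ODBC+Driver+17+for+SQL+Server"
)

# In SQL: aggregate millions of rows inside the database, returning a
# small monthly summary per account.
monthly = pd.read_sql(
    """
    SELECT account_id,
           DATEFROMPARTS(YEAR(trade_date), MONTH(trade_date), 1) AS month,
           SUM(notional) AS total_notional
    FROM trades
    GROUP BY account_id, DATEFROMPARTS(YEAR(trade_date), MONTH(trade_date), 1)
    """,
    engine,
)

# In Python: per-account z-scores to flag unusual months, which is clumsy
# to express in plain SQL.
grp = monthly.groupby("account_id")["total_notional"]
monthly["zscore"] = (monthly["total_notional"] - grp.transform("mean")) / grp.transform("std")
print(monthly[monthly["zscore"].abs() > 3])
```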
Posted 1 month ago
8.0 - 12.0 years
18 - 25 Lacs
Bengaluru
Remote
Strong proficiency in SQL (MS SQL Server or other RDBMS). Experience with database indexing, partitioning, and query optimization. Knowledge of database security best practices. Required Candidate profile: familiarity with ETL processes, data warehousing, and reporting tools; experience with Azure cloud-based databases; scripting knowledge (Python, Shell, PowerShell) is an advantage.
Posted 1 month ago
5.0 - 10.0 years
10 - 11 Lacs
Pune, Chennai, Bengaluru
Hybrid
Hello Candidates, we are hiring!!

Job Position: Ab Initio Administrator
Experience: 4+ years
Location: Mumbai, Pune, Chennai, Bangalore
Work Mode: Hybrid (3 days WFO)

Job Description:
- Strong knowledge of Ab Initio tools, including the Co>Operating System (Co>Op), EME, GDE, and other components.
- Understanding of data warehousing concepts, ETL processes, and data integration best practices.
- Manage all Ab Initio environments, including security, cluster setup, performance tuning, and continuous monitoring.
- Perform cluster maintenance activities such as patching, Co>Op upgrades, EME backup and recovery, user provisioning, and automation of routine tasks.
- Troubleshoot failed jobs and configure/maintain security policies effectively.
- Knowledge of SQL for querying and manipulating data.

NOTE: Candidates can share their resume at shrutia.talentsketchers@gmail.com
Posted 1 month ago
3.0 - 6.0 years
7 - 15 Lacs
Gurugram
Work from Office
Dear Candidate, Greetings!! Hiring for SSIS Developer, Gurgaon (WFO).

Responsibilities:
1. Must have experience building SSIS packages for ETL processes
2. End-to-end data migration
3. Must have experience in Oracle Cloud

Share resume at abhishek@xinoe.com. Regards,
Posted 1 month ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
As a Senior SAP Consultant, you will serve as a client-facing practitioner working collaboratively with clients to deliver high-quality solutions, and be a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology or equivalent and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.

Your primary responsibilities include:
- Strategic SAP solution focus: working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
- Comprehensive solution delivery: involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Professional expertise with SAP configuration: hands-on SAP configuration experience in SAP XXX (requisition to specify module/skill required) with a minimum of XX end-to-end implementations from project preparation to go-live as Consultant or Solution Architect.
- SAP process knowledge: experience with XXX processes and workflows (requisition to specify functional area, e.g., finance, supply chain, etc.).
- Experience with SAP S/4HANA: practical experience with SAP S/4HANA, demonstrating how it is applied in different client environments.
- SAP certifications: holder of SAP certifications.

Preferred technical and professional experience:
- Proven experience as an SAP Embedded Analytics & SAP Datasphere Consultant with a focus on implementation and optimization.
- In-depth knowledge of SAP Datasphere data modelling, ETL processes, and reporting tools.
- In-depth knowledge of SAP Embedded Analytics modelling for data extraction from the S/4 system.
- Strong analytical and problem-solving skills.
- Effective communication and collaboration skills.
Posted 1 month ago
4.0 - 9.0 years
15 - 20 Lacs
Bengaluru
Hybrid
Responsibilities:
- Develop and maintain dashboards and visual reports using Power BI (or other relevant reporting tools)
- Write efficient SQL queries to extract and transform data
- Collaborate with business analysts to deliver on business reporting requirements
- Perform data validation, quality assurance, and troubleshooting to ensure data integrity
- Monitor performance of BI solutions and optimize as needed
- Work closely with data engineering to maintain and enhance data models and pipelines
- Provide ad hoc analysis and insights to support business initiatives
- Train end-users on self-service BI tools and promote data-driven decision-making

Key Competencies:
- Attention to detail and data accuracy
- Business acumen and curiosity
- Team collaboration and cross-functional communication
- Initiative and ability to manage multiple priorities

Skills & Experience Required:
- 4+ years of experience in a data or reporting analyst role
- Proficiency in Power BI
- Strong SQL skills, with proven experience querying large/complex datasets
- Experience working with data visualization best practices
- Understanding of data modelling and ETL/ELT processes
- Strong analytical and problem-solving abilities
- Excellent communication and stakeholder management skills
- Experience with version control (e.g., Git) and agile methodologies a plus
- Experience querying SQL Server databases; Snowflake desirable
- Experience working with SSAS Tabular Models
- Familiarity with data governance and security best practices
Posted 1 month ago
15.0 - 20.0 years
17 - 22 Lacs
Chennai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Experience with data integration and ETL tools.
- Strong understanding of data modeling and database design principles.
- Familiarity with cloud platforms and services related to data storage and processing.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based in Chennai.
- 15 years of full-time education is required.
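As an illustration of the pipeline work this role describes, here is a minimal PySpark sketch as it might run on Databricks; the paths, table names, and quality rule are assumptions, not project specifics.

```python
# Minimal PySpark sketch of a Databricks-style pipeline: read raw files,
# apply a data-quality rule, and write a curated Delta table. Paths, table
# names, and the schema are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw landing-zone JSON files.
raw = spark.read.json("/mnt/landing/orders/")

# Transform: enforce a simple quality rule and derive a date column.
clean = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)

# Load: append to a curated Delta table (Delta is the Databricks default format).
clean.write.format("delta").mode("append").saveAsTable("curated.orders")
```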
Posted 1 month ago
15.0 - 20.0 years
17 - 22 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting and optimizing existing data workflows to enhance performance and reliability.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to ensure efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data pipeline architecture and design.
- Experience with ETL processes and data integration techniques.
- Familiarity with data quality frameworks and best practices.
- Knowledge of cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based in Hyderabad.
- 15 years of full-time education is required.
Posted 1 month ago