Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
0.0 - 4.0 years
9 - 13 Lacs
Pune
Work from Office
Project description
You will be working in a global team that manages and performs a global technical control. You'll be joining the Asset Management team, which looks after the asset management data foundation and operates a set of in-house developed tooling. As an IT engineer you'll play an important role in ensuring the development methodology is followed, and lead technical design discussions with the architects. Our culture centers around partnership with our businesses; transparency, accountability, and empowerment; and passion for the future.

Responsibilities
- Design, develop, and maintain scalable data solutions using Starburst.
- Collaborate with cross-functional teams to integrate Starburst with existing data sources and tools.
- Optimize query performance and ensure data security and compliance.
- Implement monitoring and alerting systems for data platform health.
- Stay updated with the latest developments in data engineering and analytics.

Skills
Must have
- Bachelor's or Master's degree in a related technical field, or equivalent professional experience.
- Prior experience as a Software Engineer applying new engineering principles to improve existing systems, including leading complex, well-defined projects.
- Strong knowledge of big-data languages, including SQL, Hive, Spark/PySpark, Presto, and Python.
- Strong knowledge of big-data platforms, such as the Apache Hadoop ecosystem, AWS EMR, Qubole, or Trino/Starburst.
- Good knowledge and experience of cloud platforms such as AWS, GCP, or Azure.
- Continuous learner with the ability to apply previous experience and knowledge to quickly master new technologies.
- Demonstrates the ability to select among available technologies to implement and solve for a need.
- Able to understand and design moderately complex systems.
- Understanding of testing and monitoring tools.
- Ability to test, debug, and fix issues within established SLAs.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Understanding of data governance and compliance standards.

Nice to have
- Data Architecture & Engineering: Design and implement efficient and scalable data warehousing solutions using Azure Databricks and Microsoft Fabric.
- Business Intelligence & Data Visualization: Create insightful Power BI dashboards to help drive business decisions.

Other
Languages: English (C1 Advanced)
Seniority: Senior
Posted 4 days ago
3.0 - 8.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Overview
The primary focus will be to lead development work within the Azure Data Lake environment and other related ETL technologies, with responsibility for ensuring on-time and on-budget delivery, satisfying project requirements while adhering to enterprise architecture standards. This role will also have L3 responsibilities for ETL processes.

Responsibilities
- Delivery of key Enterprise Data Warehouse and Azure Data Lake projects on time and within budget.
- Contribute to solution design and build to ensure scalability, performance, and reuse of data and other components.
- Ensure on-time, on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards.
- Possess strong problem-solving abilities with a focus on managing to business outcomes through collaboration with multiple internal and external parties.
- Enthusiastic, willing, and able to learn; continuously develops skills and techniques; enjoys change and seeks continuous improvement.
- A clear communicator, both written and verbal, with good presentation skills; fluent and proficient in English.
- Customer focused and a team player.

Qualifications / Experience
- Bachelor's degree in Computer Science, MIS, Business Management, or a related field
- 3+ years of experience in Information Technology
- 1+ years of experience in Azure Data Lake

Technical Skills
- Proven experience with development activities in Data, BI, or Analytics projects
- Solutions delivery experience: knowledge of the system development lifecycle, integration, and sustainability
- Strong knowledge of Teradata architecture and SQL
- Good knowledge of Azure Data Factory or Databricks
- Knowledge of Presto / Denodo / Infoworks is desirable
- Knowledge of data warehousing concepts and data catalog tools (Alation)
Posted 6 days ago
5.0 - 10.0 years
19 - 25 Lacs
Hyderabad
Work from Office
Overview
The primary focus will be to perform development work within the Azure Data Lake environment and other related ETL technologies, with responsibility for ensuring on-time and on-budget delivery, satisfying project requirements while adhering to enterprise architecture standards. This role will also have L3 responsibilities for ETL processes.

Responsibilities
- Delivery of key Azure Data Lake projects on time and within budget.
- Contribute to solution design and build to ensure scalability, performance, and reuse of data and other components.
- Ensure on-time, on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards.
- Possess strong problem-solving abilities with a focus on managing to business outcomes through collaboration with multiple internal and external parties.
- Enthusiastic, willing, and able to learn; continuously develops skills and techniques; enjoys change and seeks continuous improvement.
- A clear communicator, both written and verbal, with good presentation skills; fluent and proficient in English.
- Customer focused and a team player.

Qualifications
- Bachelor's degree in Computer Science, MIS, Business Management, or a related field
- 5+ years of experience in Information Technology
- 4+ years of experience in Azure Data Lake

Technical Skills
- Proven experience with development activities in Data, BI, or Analytics projects
- Solutions delivery experience: knowledge of the system development lifecycle, integration, and sustainability
- Strong knowledge of PySpark and SQL
- Good knowledge of Azure Data Factory or Databricks
- Knowledge of Presto / Denodo is desirable
- Knowledge of FMCG business processes is desirable

Non-Technical Skills
- Excellent remote collaboration skills
- Experience working in a matrix organization with diverse priorities
- Exceptional written and verbal communication skills, along with collaboration and listening skills
- Ability to work with agile delivery methodologies
- Ability to ideate requirements and design iteratively with business partners without formal requirements documentation
Posted 6 days ago
3.0 - 6.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Skills: Microsoft Azure, Hadoop, Spark, Databricks, Airflow, Kafka, PySpark

Requirements
- Experience working with distributed technology tools for developing batch and streaming pipelines using SQL, Spark, Python, Airflow, Scala, and Kafka.
- Experience in cloud computing, e.g., AWS, GCP, Azure, etc.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Strong skills in building positive relationships across Product and Engineering.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Experience creating/configuring Jenkins pipelines for a smooth CI/CD process for managed Spark jobs, building Docker images, etc.
- Working knowledge of data warehousing, data modelling, governance, and data architecture.
- Experience working with data platforms, including EMR, Airflow, and Databricks (Data Engineering & Delta Lake components).
- Experience working in Agile and Scrum development processes.
- Experience with EMR/EC2, Databricks, etc.
- Experience working with data warehousing tools, including SQL databases, Presto, and Snowflake.
- Experience architecting data products in streaming, serverless, and microservices architectures and platforms.
Posted 1 week ago
3.0 - 6.0 years
10 - 15 Lacs
Gurugram, Bengaluru
Work from Office
- 3+ years of experience in data science roles, working with tabular data in large-scale projects.
- Experience in feature engineering and working with methods such as XGBoost, LightGBM, factorization machines, and similar algorithms.
- Experience in the adtech or fintech industries is a plus. Familiarity with clickstream data, predictive modeling for user engagement, or bidding optimization is highly advantageous.
- MS or PhD in mathematics, computer science, physics, statistics, electrical engineering, or a related field.
- Proficiency in Python (3.9+), with experience in scientific computing and machine learning tools (e.g., NumPy, Pandas, SciPy, scikit-learn, matplotlib).
- Familiarity with deep learning frameworks (such as TensorFlow or PyTorch) is a plus.
- Strong expertise in applied statistical methods, A/B testing frameworks, advanced experiment design, and interpreting complex experimental results.
- Experience querying and processing data using SQL and working with distributed data storage solutions (e.g., AWS Redshift, Snowflake, BigQuery, Athena, Presto, MinIO).
- Experience in budget allocation optimization, lookalike modeling, LTV prediction, or churn analysis is a plus.
- Ability to manage multiple projects, prioritize tasks effectively, and maintain a structured approach to complex problem-solving.
- Excellent communication and collaboration skills to work effectively with both technical and business teams.
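The applied-statistics side of this role (A/B testing, experiment interpretation) can be illustrated with a short, standard-library-only sketch of a two-proportion z-test; the conversion counts are hypothetical, and a production analysis would typically use a vetted library such as SciPy or statsmodels.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal-approximation p-value via the error function (no SciPy needed)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: control converts 100/1000, variant 130/1000
z, p = two_proportion_z_test(100, 1000, 130, 1000)
```

With these illustrative numbers the lift is significant at the 5% level; the same arithmetic underlies most simple A/B readouts.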
Posted 1 week ago
3.0 - 6.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Overview
The individual will spend time building and maintaining our in-house planogram platform and leverage analytical and critical reasoning to solve complex, multidimensional problems using quantitative information and applying statistical and machine learning techniques. The C# .NET Developer will work with team members to develop the software that will implement our product assortment and placement onto PepsiCo's planogram platform.

Responsibilities
- Expand and maintain the in-house planogram/reporting platform built on the C# .NET framework
- Work with the team lead on enhancing the platform
- Optimize shelf assortment across multiple categories while satisfying days-of-supply, blocking, and flow constraints
- Expand the platform to new categories
- Apply machine learning techniques to assortment optimization and product placement
- Enhance and maintain the platform UI

Qualifications
- B.S./M.S. in a quantitative discipline required (e.g., computer science, mathematics, operations research, engineering)
- 7+ years of coding experience in C#, specifically using the .NET framework
- 3+ years of coding experience in Angular or an equivalent JS framework
- Strong skills in C# using the ASP.NET and .NET Core frameworks, LINQ, and Entity Framework
- Experience using project management tools such as DevOps
- Ability to support/develop Windows and web platforms simultaneously
- High-level querying skills using query languages such as SQL or Presto; knowledge of window functions, joins, and sub-queries in SQL Server or any RDBMS
- Experience with Azure, .NET Core, Visual Studio, SQL Server
Posted 1 week ago
11.0 - 15.0 years
35 - 40 Lacs
Hyderabad
Work from Office
Overview
The primary focus will be to lead development work within the Azure Data Lake environment and other related ETL technologies, with responsibility for ensuring on-time and on-budget delivery, satisfying project requirements while adhering to enterprise architecture standards. The role will lead key data lake projects and resources, including innovation-related initiatives (e.g., adoption of technologies like Databricks, Presto, Denodo, Python, Azure Data Factory; database encryption; enabling rapid experimentation). This role will also have L3 and release management responsibilities for ETL processes.

Responsibilities
- Lead delivery of key Enterprise Data Warehouse and Azure Data Lake projects on time and within budget.
- Drive solution design and build to ensure scalability, performance, and reuse of data and other components.
- Ensure on-time, on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards.
- Manage work intake, prioritization, and release timing, balancing demand and available resources; ensure tactical initiatives are aligned with the strategic vision and business needs.
- Oversee coordination and partnerships with Business Relationship Managers, Architecture, and IT services teams to develop and maintain EDW and data lake best practices and standards, along with appropriate quality assurance policies and procedures.
- May lead a team of employee and contract resources to meet build requirements:
  - Set priorities for the team to ensure task completion
  - Coordinate work activities with other IT services and business teams
  - Hold the team accountable for milestone deliverables
  - Provide L3 support for existing applications
  - Release management

Qualifications
- Bachelor's degree in Computer Science, MIS, Business Management, or a related field
- 11+ years of experience in Information Technology or Business Relationship Management
- 7+ years of experience in Data Warehouse/Azure Data Lake
- 3 years of experience in Azure Data Lake
- 2 years of experience in project management

Technical Skills
- Thorough knowledge of data warehousing / data lake concepts
- Hands-on experience with tools like Azure Data Factory, Databricks, PySpark, and other data management tools on Azure
- Proven experience in managing Data, BI, or Analytics projects
- Solutions delivery experience: expertise in the system development lifecycle, integration, and sustainability
- Experience in data modeling or database work

Non-Technical Skills
- Excellent remote collaboration skills
- Experience working in a matrix organization with diverse priorities
- Experience dealing with and managing multiple vendors
- Exceptional written and verbal communication skills, along with collaboration and listening skills
- Ability to work with agile delivery methodologies
- Ability to ideate requirements and design iteratively with business partners without formal requirements documentation
- Ability to budget resources and funding to meet project deliverables
Posted 1 week ago
5.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
About us
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about Target in India
At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

About the Role
The Senior RBX Data Specialist role at Target in India involves the end-to-end management of data: building and maintaining pipelines through ETL/ELT and data modeling, ensuring data accuracy and system performance, and resolving data flow issues. It also requires analyzing data to generate insights, creating visualizations for stakeholders, automating processes for efficiency, and collaborating effectively across both business and technical teams. You will also answer ad-hoc questions from your business users by conducting quick analyses of relevant data, identifying trends and correlations, and forming hypotheses to explain the observations. Some of this will lead to bigger projects of increased complexity, where you will work as part of a larger team but also independently execute specific tasks. Finally, you are expected to always adhere to the project schedule and technical rigor, as well as requirements for documentation, code versioning, etc.

Key Responsibilities
- Data Pipeline and Maintenance: Monitor data pipelines and warehousing systems to ensure optimal health and performance. Ensure data integrity and accuracy throughout the data lifecycle.
- Incident Management and Resolution: Drive the resolution of data incidents and document their causes and fixes, collaborating with teams to prevent recurrence.
- Automation and Process Improvement: Identify and implement automation opportunities and DataOps best practices to enhance the efficiency, reliability, and scalability of data processes.
- Collaboration and Communication: Work closely with data teams and stakeholders to understand data pipeline architecture and dependencies, ensuring timely and accurate data delivery while effectively communicating data issues and participating in relevant discussions.
- Data Quality and Governance: Implement and enforce data quality standards, monitor metrics for improvement, and support data governance by ensuring policy compliance.
- Documentation and Reporting: Create and maintain clear and concise documentation of data pipelines, processes, and troubleshooting steps. Develop and generate reports on data operations performance and key metrics.
Core responsibilities are described within this job description. Job duties may change at any time due to business needs.

About You
- B.Tech / B.E. or equivalent (completed) degree
- 5+ years of relevant work experience
- Experience in Marketing/Customer/Loyalty/Retail analytics is preferable
- Exposure to A/B testing
- Familiarity with big data technologies, data languages, and visualization tools
- Exposure to languages such as Python and R for data analysis and modelling
- Proficiency in SQL for data extraction, manipulation, and analysis, with experience in big data query frameworks such as Hive, Presto, or BigQuery
- Solid foundational knowledge of mathematics, statistics, and predictive modelling techniques, including linear regression, logistic regression, time-series models, and classification techniques
- Ability to simplify complex technical and analytical methodologies for broad audiences
- Ability to identify process and tool improvements and implement change
- Excellent written and verbal English communication skills for global working
- Motivation to initiate, build, and maintain global partnerships
- Ability to function in group and/or individual settings
- Willing and able to work from our office location (Bangalore HQ) as required by business needs and brand initiatives

Useful Links
Life at Target: https://india.target.com/
Benefits: https://india.target.com/life-at-target/workplace/benefits
Culture: https://india.target.com/life-at-target/belonging
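The data-quality monitoring this kind of role describes can be sketched with nothing but the Python standard library; the field names, checks, and sample batch below are illustrative assumptions, not an actual schema.

```python
def run_quality_checks(rows, required_fields, unique_key):
    """Run simple completeness and uniqueness checks on a batch of records.

    Returns a dict of issue counts that a monitoring job could alert on.
    """
    issues = {"missing_field": 0, "null_value": 0, "duplicate_key": 0}
    seen_keys = set()
    for row in rows:
        for field in required_fields:
            if field not in row:
                issues["missing_field"] += 1      # column absent entirely
            elif row[field] in (None, ""):
                issues["null_value"] += 1         # column present but empty
        key = row.get(unique_key)
        if key in seen_keys:
            issues["duplicate_key"] += 1          # primary-key collision
        seen_keys.add(key)
    return issues

# Illustrative batch: one null value and one duplicated order_id
batch = [
    {"order_id": 1, "sku": "A1", "amount": 10.0},
    {"order_id": 2, "sku": None, "amount": 12.5},
    {"order_id": 2, "sku": "B7", "amount": 9.0},
]
report = run_quality_checks(batch, ["order_id", "sku", "amount"], "order_id")
```

In practice checks like these run per pipeline stage and feed the alerting described in the responsibilities above.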
Posted 1 week ago
7.0 - 9.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Information
- Job Opening ID: ZR_2162_JOB
- Date Opened: 15/03/2024
- Industry: Technology
- Job Type:
- Work Experience: 7-9 years
- Job Title: Sr Data Engineer
- City: Bangalore
- Province: Karnataka
- Country: India
- Postal Code: 560004
- Number of Positions: 5

Mandatory Skills: Microsoft Azure, Hadoop, Spark, Databricks, Airflow, Kafka, PySpark

Requirements
- Experience working with distributed technology tools for developing batch and streaming pipelines using SQL, Spark, Python, Airflow, Scala, and Kafka.
- Experience in cloud computing, e.g., AWS, GCP, Azure, etc.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Strong skills in building positive relationships across Product and Engineering.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Experience creating/configuring Jenkins pipelines for a smooth CI/CD process for managed Spark jobs, building Docker images, etc.
- Working knowledge of data warehousing, data modelling, governance, and data architecture.
- Experience working with data platforms, including EMR, Airflow, and Databricks (Data Engineering & Delta Lake components).
- Experience working in Agile and Scrum development processes.
- Experience with EMR/EC2, Databricks, etc.
- Experience working with data warehousing tools, including SQL databases, Presto, and Snowflake.
- Experience architecting data products in streaming, serverless, and microservices architectures and platforms.
Posted 1 week ago
8.0 - 13.0 years
40 - 65 Lacs
Bengaluru
Work from Office
About the team
When 5% of Indian households shop with us, it's important to build resilient systems to manage millions of orders every day. We've done this, with zero downtime! Sounds impossible? Well, that's the kind of engineering muscle that has helped Meesho become the e-commerce giant it is today. We value speed over perfection, and see failures as opportunities to become better. We've taken steps to inculcate a strong 'Founder's Mindset' across our engineering teams, helping us grow and move fast. We place special emphasis on the continuous growth of each team member, and we do this with regular 1-1s and open communication. As Engineering Manager, you will be part of a team of self-starters who thrive on teamwork and constructive feedback. We know how to party as hard as we work! If we aren't building unparalleled tech solutions, you can find us debating the plot points of our favourite books and games, or even gossiping over chai. So, if a day filled with building impactful solutions with a fun team sounds appealing to you, join us.

About the role
We are looking for a seasoned Engineering Manager well-versed in emerging technologies to join our team. As an Engineering Manager, you will ensure consistency and quality by shaping the right strategies. You will keep an eye on all engineering projects and ensure all duties are fulfilled. You will analyse other employees' tasks and carry on collaborations effectively. You will also transform newbies into experts and build reports on the progress of all projects.

What you will do
- Design tasks for other engineers, keeping Meesho's guidelines and standards in mind
- Keep a close watch on various projects and monitor the progress
- Drive excellence in quality across the organisation and the solutioning of product problems
- Collaborate with the sales and design teams to create new products
- Manage engineers and take ownership of the project while ensuring product scalability
- Conduct regular meetings to plan and develop reports on the progress of projects

What you will need
- Bachelor's / Master's in computer science
- At least 8+ years of professional experience
- At least 4+ years' experience in managing software development teams
- Experience in building large-scale distributed systems
- Experience in scalable platforms
- Expertise in Java/Python/Go-Lang and multithreading
- Good understanding of Spark and its internals
- Deep understanding of transactional and NoSQL DBs
- Deep understanding of messaging systems (Kafka)
- Good experience with cloud infrastructure, AWS preferably
- Ability to drive sprints and OKRs with good stakeholder management experience
- Exceptional team management skills
- Experience in managing a team of 4-5 junior engineers
- Good understanding of streaming and real-time pipelines
- Good understanding of data modelling concepts and data quality tools
- Good knowledge of Business Intelligence tools (Metabase, Superset, Tableau, etc.)
- Good to have: knowledge of Trino, Flink, Presto, Druid, Pinot, etc.
- Good to have: knowledge of data pipeline building
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Skill required: Delivery - Marketing Analytics and Reporting
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years

What would you do?
Data & AI: Analytical processes and technologies applied to marketing-related data to help businesses understand and deliver relevant experiences for their audiences, understand their competition, measure and optimize marketing campaigns, and optimize their return on investment.

What are we looking for?
Data analytics, with a specialization in the marketing domain.
Domain-specific skills:
- Familiarity with ad tech and B2B sales
Technical skills:
- Proficiency in SQL and Python
- Experience in efficiently building, publishing, and maintaining robust data models and warehouses for self-serve querying and advanced data science and ML analytics
- Experience in conducting ETL/ELT with very large and complicated datasets and handling DAG data dependencies
- Strong proficiency with SQL dialects on distributed or data-lake-style systems (Presto, BigQuery, Spark/Hive SQL, etc.), including SQL-based experience in nested data structure manipulation, windowing functions, query optimization, data partitioning techniques, etc. Knowledge of Google BigQuery optimization is a plus.
- Experience in schema design and data modeling strategies (e.g., dimensional modeling, data vault, etc.)
- Significant experience with dbt (or similar tools) and Spark-based (or similar) data pipelines
- General knowledge of Jinja templating in Python
- Hands-on experience with cloud provider integration and automation via CLIs and APIs
Soft skills:
- Ability to work well in a team
- Agility for quick learning
- Written and verbal communication

Roles and Responsibilities:
In this role you are required to analyze and solve increasingly complex problems. Your day-to-day interactions are with peers within Accenture. You are likely to have some interaction with clients and/or Accenture management. You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments. Decisions you make impact your own work and may impact the work of others. In this role you will be an individual contributor and/or oversee a small work effort and/or team. Please note that this role may require you to work in rotational shifts.

Qualifications: Any Graduation
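The windowing-function skills listed here carry across SQL dialects; a minimal sketch using Python's bundled sqlite3 driver (SQLite 3.25+ ships window-function support) stands in for Presto or BigQuery, which cannot be assumed here, and the table and column names are illustrative.

```python
import sqlite3

# In-memory database standing in for a distributed warehouse table
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id TEXT, ts INTEGER, amount REAL);
    INSERT INTO events VALUES
        ('u1', 1, 5.0), ('u1', 2, 7.5), ('u2', 1, 3.0), ('u2', 3, 4.0);
""")

# Window function: running spend per user, ordered by event time.
# The default frame (UNBOUNDED PRECEDING to CURRENT ROW) gives a running total.
rows = conn.execute("""
    SELECT user_id, ts,
           SUM(amount) OVER (PARTITION BY user_id ORDER BY ts) AS running_total
    FROM events
    ORDER BY user_id, ts
""").fetchall()
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` shape works, with minor dialect differences, on Presto, BigQuery, and Spark SQL.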
Posted 1 week ago
3.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
WHAT YOU WILL WORK ON
- Serve as a liaison between product, engineering, and data consumers by analyzing the data, finding gaps, and helping drive the roadmap
- Support and troubleshoot issues (data and process), identify root causes, and proactively recommend sustainable corrective actions by collaborating with engineering/product teams
- Communicate actionable insights using data, often for stakeholders and non-technical audiences
- Write technical specifications describing requirements for data movement, transformation, and quality checks

WHAT YOU BRING
- Bachelor's degree in Computer Science, MIS, other quantitative disciplines, or related fields
- 3-7 years of relevant analytical experience, with the ability to translate strategic vision into requirements and work with the best engineers, product managers, and data scientists
- Ability to conduct data analysis, develop and test hypotheses, and deliver insights with minimal supervision
- Experience identifying and defining KPIs using data for business areas such as Sales, Consumer Behaviour, Supply Chain, etc.
- Exceptional SQL skills
- Experience with a modern visualization tool stack, such as Tableau, Power BI, Domo, etc.
- Knowledge of open-source, big data, and cloud infrastructure such as AWS, Hive, Snowflake, Presto, etc.
- Incredible attention to detail, with a structured problem-solving approach
- Excellent communication skills (written and verbal)
- Experience with agile development methodologies
- Experience with retail or ecommerce domains is a plus
Posted 1 week ago
5.0 - 10.0 years
13 - 22 Lacs
Bengaluru
Work from Office
Job Opportunity: Senior Data Analyst (Bangalore)
Location: Bangalore, India
Company: GSPANN Technologies
Apply: Send your resume to heena.ruchwani@gspann.com

GSPANN is hiring a Senior Data Analyst with 5-7 years of experience to join our dynamic team in Bangalore!

What We're Looking For:
- Education: Bachelor's degree in Computer Science, MIS, or a related field
- Experience: 5-7 years in data analysis, with a strong ability to translate business strategy into actionable insights
- Advanced SQL expertise
- Proficiency in Tableau, Power BI, or Domo
- Experience with AWS, Hive, Snowflake, Presto
- Ability to define and track KPIs across domains like Sales, Consumer Behavior, and Supply Chain
- Strong problem-solving skills and attention to detail
- Excellent communication and collaboration abilities
- Experience working in Agile environments
- Retail or eCommerce domain experience is a plus

If this sounds like the right fit for you, don't wait: send your updated resume to heena.ruchwani@gspann.com today!
Posted 1 week ago
3.0 - 7.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Your Role
- Experience in data engineering and end-to-end implementation of CDP projects
- Proficient in SQL, CDP (Treasure Data), Python/Digdag, Presto/SQL, and data engineering
- Hands-on experience with Treasure Data CDP implementation and management
- Excellent SQL skills, including advanced query writing and optimization
- Oversee the end-to-end maintenance and operation of the Treasure Data CDP
- Familiarity with data integration, API operations, and audience segmentation

Your Profile
- Experience in unifying data across multiple brands and regions, ensuring consistency and accuracy
- Ability to create and manage data workflows in Treasure Data
- Collaborate with cross-functional teams to ensure successful data integration and usage
- Troubleshoot and optimize data pipelines and processes for scalability and performance
- Stay updated on the latest features and best practices in Treasure Data and related technologies
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Starburst Enterprise
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Skills: Starburst, Terraform, Azure, AWS IAM, AWS VPC, AWS S3, AWS Glue, AWS Athena, AWS EKS (Kubernetes), AWS ELB, Route 53, AWS EC2, relational DBs (Redshift, PostgreSQL), AWS DynamoDB, Kubernetes, Python, AWS API Gateway

Roles and responsibilities:
1. Proven experience as a Starburst Administrator in an AWS environment:
   a. Cluster management & administration
   b. Performance tuning
   c. Security & compliance
   d. Data integration & connectivity
   e. Upgrades & maintenance
2. Strong proficiency in managing and optimizing Starburst clusters.
3. Experience with Presto and related big data technologies.
4. Solid understanding of AWS services and cloud architecture.
5. Proficiency in scripting languages (e.g., Python, Bash) for automation.
6. Familiarity with version control systems (e.g., Git).
7. Strong problem-solving and troubleshooting skills.
8. Excellent communication and collaboration abilities.
9. AWS Certified Big Data Specialty certification is a plus.

Professional Attributes: Excellent communication, collaboration, and analytical skills
Shift Timing: Shift B
Qualification: 15 years full time education
Posted 2 weeks ago
4.0 - 9.0 years
6 - 11 Lacs
Chennai
Work from Office
Skill & Experience
Mandatory Skills: Azure Cloud Platform; Azure services, including advanced knowledge of Azure Databricks, Hive Metastore, and Presto; Azure Kubernetes Service (AKS); Terraform; Ansible; Azure DevOps tools; GitHub
- Strong experience with Azure services, including advanced knowledge of Azure Databricks, Hive Metastore, and Presto. A basic understanding of Azure Kubernetes Service (AKS) is required.
- Proficiency in Terraform and Ansible for infrastructure management is a plus.
- 4+ years of experience in cloud technologies, including at least 3 years working with Azure Databricks and data management technologies.
- Demonstrated experience in managing and optimizing large-scale data processing environments.
- Excellent problem-solving skills with the ability to optimize and manage complex cloud environments.
- Proficiency in using Azure DevOps for CI/CD pipelines and GitHub for version control.
- Microsoft Certified: Azure Solutions Architect Expert (AZ-305) or an equivalent advanced certification is preferred. Additional certifications in Azure Databricks or related data technologies are a plus.

Responsibility areas: Azure Databricks management, Hive Metastore management, Presto operations, automation & CI/CD, optimization & performance, security & compliance, documentation & handover, knowledge sharing, stakeholder communication, continuous improvement
Posted 2 weeks ago
5.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Data Strategy and Planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans.
Data Modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and comply with industry best practices.
Database Design and Management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security.
Data Integration: Define and implement data integration strategies to facilitate seamless flow of information across systems.
Responsibilities:
Experience in data architecture and engineering
Proven expertise with the Snowflake data platform
Strong understanding of ETL/ELT processes and data integration
Experience with data modeling and data warehousing concepts
Familiarity with performance tuning and optimization techniques
Excellent problem-solving skills and attention to detail
Strong communication and collaboration skills
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Cloud & Data Architecture: AWS, Snowflake
ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions
Big Data & Analytics: Athena, Presto, Hadoop
Database & Storage: SQL, SnowSQL
Security & Compliance: IAM, KMS, Data Masking
Preferred technical and professional experience:
Cloud Data Warehousing: Snowflake (data modeling, query optimization)
Data Transformation: DBT (Data Build Tool) for ELT pipeline management
Metadata & Data Governance: Alation (data catalog, lineage, governance)
Posted 2 weeks ago
5.0 - 10.0 years
8 - 13 Lacs
Gurugram
Work from Office
Job Summary
As a Data Engineer at Synechron, you will play a pivotal role in harnessing data to drive business value. Your expertise will be essential in developing and maintaining data pipelines, ensuring data integrity, and facilitating analytics that inform strategic decisions. This role contributes significantly to our business objectives by optimizing data processing and enabling insightful reporting across the organization.
Software Requirements
Required: AWS Redshift (3+ years of experience); Spark (3+ years); Python (3+ years); complex SQL (3+ years); shell scripting (2+ years); Docker (2+ years); Kubernetes (2+ years); Bitbucket (2+ years).
Preferred: DBT; Dataiku; Kubernetes cluster management.
Overall Responsibilities
Develop and optimize data pipelines using big data technologies, ensuring seamless data flow and accessibility.
Collaborate with cross-functional teams to translate business requirements into technical solutions.
Ensure high data quality and integrity in analytics and reporting processes.
Implement data architecture and modeling best practices to support strategic objectives.
Troubleshoot and resolve data-related issues, maintaining a service-first mentality to enhance customer satisfaction.
Technical Skills (by Category)
Programming Languages: Essential: Python, SQL. Preferred: Shell scripting.
Databases/Data Management: Essential: AWS Redshift, Hive, Presto. Preferred: DBT.
Cloud Technologies: Essential: AWS. Preferred: Kubernetes, Docker.
Frameworks and Libraries: Essential: Spark. Preferred: Dataiku.
Development Tools and Methodologies: Essential: Bitbucket, Airflow or Argo Workflows.
Experience Requirements
6-7 years of experience in data engineering or related roles.
Strong understanding of data and analytics concepts, with proven experience in big data technologies.
Experience in the financial services industry preferred but not required.
Alternative pathways: Significant project experience in data architecture and analytics.
Day-to-Day Activities
Design and implement scalable data pipelines.
Participate in regular team meetings to align on project goals and deliverables.
Collaborate with stakeholders to refine data processes and analytics.
Make informed decisions on data management strategies and technologies.
Qualifications
Bachelor's degree in Computer Science, Data Engineering, or a related field (or equivalent experience).
Certifications in AWS or relevant data engineering technologies preferred.
Commitment to continuous professional development in data engineering and analytics.
Professional Competencies
Strong critical thinking and problem-solving capabilities, with a focus on innovation.
Effective communication skills and stakeholder management.
Ability to work collaboratively in a team-oriented environment.
Adaptability and a willingness to learn new technologies and methodologies.
Excellent time and priority management to meet deadlines and project goals.
Posted 2 weeks ago
5.0 - 10.0 years
22 - 27 Lacs
Navi Mumbai
Work from Office
Data Strategy and Planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans.
Data Modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and comply with industry best practices.
Database Design and Management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security.
Data Integration: Define and implement data integration strategies to facilitate seamless flow of information across systems.
Responsibilities:
Experience in data architecture and engineering
Proven expertise with the Snowflake data platform
Strong understanding of ETL/ELT processes and data integration
Experience with data modeling and data warehousing concepts
Familiarity with performance tuning and optimization techniques
Excellent problem-solving skills and attention to detail
Strong communication and collaboration skills
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Cloud & Data Architecture: AWS, Snowflake
ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions
Big Data & Analytics: Athena, Presto, Hadoop
Database & Storage: SQL, SnowSQL
Security & Compliance: IAM, KMS, Data Masking
Preferred technical and professional experience:
Cloud Data Warehousing: Snowflake (data modeling, query optimization)
Data Transformation: DBT (Data Build Tool) for ELT pipeline management
Metadata & Data Governance: Alation (data catalog, lineage, governance)
Posted 3 weeks ago
5.0 - 10.0 years
9 - 19 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Title: Big Data Administrator
Location: Hyderabad (weekly once WFO)
Experience: 5+ years
Department: Data Engineering / IT Infrastructure
Job Summary: We are seeking a Big Data Administrator with strong expertise in Linux systems, AWS infrastructure, and big data technologies. This role is ideal for someone experienced in managing large-scale Hadoop ecosystems in production, with a deep understanding of observability, performance tuning, and automation using tools like Terraform or Ansible.
Key Responsibilities:
Manage and maintain large-scale big data clusters (Cloudera, Hortonworks, or AWS EMR)
Develop and support infrastructure as code using Terraform or Ansible
Administer Hadoop ecosystem components: HDFS, YARN, Hive (Tez, LLAP), Presto, Spark
Implement and monitor observability tools such as Prometheus, InfluxDB, Dynatrace, Grafana, and Splunk
Optimize SQL performance on Hive/Spark and understand query plans
Automate cluster operations using Python (PySpark) or shell scripting
Support data analysts and scientists with tools like JupyterHub, RStudio, H2O, and SAS
Handle data in various formats: ORC, Parquet, Avro
Integrate with and support Kubernetes-based environments (if applicable)
Collaborate across teams for deployments, monitoring, and troubleshooting
Must-Have Skills:
5+ years in Linux system administration and AWS cloud infrastructure
Experience with Cloudera, Hortonworks, or EMR in production
Strong in Terraform/Ansible for automation
Solid hands-on experience with HDFS, YARN, Hive, Spark, Presto
Proficient in Python and shell scripting
Familiar with observability tools: Grafana, Prometheus, InfluxDB, Splunk, Dynatrace
Familiarity with Active Directory and Windows VDI platforms (Citrix, AWS Workspaces)
Nice-to-Have Skills:
Experience with Airflow, Oozie
Familiar with Pandas, NumPy, SciPy, PyTorch
Prior use of Jenkins, Chef, Packer
Comfortable reading code in Java, Scala, Python, R
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
Strong communication, collaboration, and troubleshooting skills
Ability to thrive in remote or hybrid work environments
Please send your updated resume to komalikab@upwardiq.com
Posted 3 weeks ago
4 - 6 years
30 - 34 Lacs
Bengaluru
Work from Office
Overview Annalect is seeking a hands-on Data QA Manager to lead and elevate data quality assurance practices across our growing suite of software and data products. This is a technical leadership role embedded within our Technology teams, focused on establishing best-in-class data quality processes that enable trusted, scalable, and high-performance data solutions. As a Data QA Manager, you will drive the design, implementation, and continuous improvement of end-to-end data quality frameworks, with a strong focus on automation, validation, and governance. You will work closely with data engineering, product, and analytics teams to ensure data integrity, accuracy, and compliance across complex data pipelines, platforms, and architectures, including Data Mesh and modern cloud-based ecosystems. This role requires deep technical expertise in SQL, Python, data testing frameworks like Great Expectations, data orchestration tools (Airbyte, DbT, Trino, Starburst), and cloud platforms (AWS, Azure, GCP). You will lead a team of Data QA Engineers while remaining actively involved in solution design, tool selection, and hands-on QA execution. Responsibilities Key Responsibilities: Develop and implement a comprehensive data quality strategy aligned with organizational goals and product development initiatives. Define and enforce data quality standards, frameworks, and best practices, including data validation, profiling, cleansing, and monitoring processes. Establish data quality checks and automated controls to ensure the accuracy, completeness, consistency, and timeliness of data across systems. Collaborate with Data Engineering, Product, and other teams to design and implement scalable data quality solutions integrated within data pipelines and platforms. Define and track key performance indicators (KPIs) to measure data quality and effectiveness of QA processes, enabling actionable insights for continuous improvement. 
Generate and communicate regular reports on data quality metrics, issues, and trends to stakeholders, highlighting opportunities for improvement and mitigation plans. Maintain comprehensive documentation of data quality processes, procedures, standards, issues, resolutions, and improvements to support organizational knowledge-sharing. Provide training and guidance to cross-functional teams on data quality best practices, fostering a strong data quality mindset across the organization. Lead, mentor, and develop a team of Data QA Analysts/Engineers, promoting a high-performance, collaborative, and innovative culture. Provide thought leadership and subject matter expertise on data quality, influencing technical and business stakeholders toward quality-focused solutions. Continuously evaluate and adopt emerging tools, technologies, and methodologies to advance data quality assurance capabilities and automation. Stay current with industry trends, innovations, and evolving best practices in data quality, data engineering, and analytics to ensure cutting-edge solutions. Qualifications Required Skills 11+ years of hands-on experience in Data Quality Assurance, Data Test Automation, Data Comparison, and Validation across large-scale datasets and platforms. Strong proficiency in SQL for complex data querying, data validation, and data quality investigations across relational and distributed databases. Deep knowledge of data structures, relational and non-relational databases, stored procedures, packages, functions, and advanced data manipulation techniques. Practical experience with leading data quality tools such as Great Expectations, DbT tests, and data profiling and monitoring solutions. Experience with data mesh and distributed data architecture principles for enabling decentralized data quality frameworks. Hands-on experience with modern query engines and data platforms, including Trino/Presto, Starburst, and Snowflake. 
Experience working with data integration and ETL/ELT tools such as Airbyte, AWS Glue, and DbT for managing and validating data pipelines. Strong working knowledge of Python and related data libraries (e.g., Pandas, NumPy, SQLAlchemy) for building data quality tests and automation scripts.
Posted 1 month ago
5 - 10 years
15 - 20 Lacs
Hyderabad, Pune, Delhi / NCR
Work from Office
Job description
About the Company: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations with our experience in retail, high-technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm.
Role: Senior Data Analyst
Experience: 5+ years
Skill Set: Data Analysis, SQL, and Cloud (AWS, Azure, GCP)
Location: Pune, Hyderabad, Gurgaon
Key Requirements:
Bachelor's degree in Computer Science, MIS, or related fields.
6-7 years of relevant analytical experience, translating strategic vision into actionable requirements.
Ability to conduct data analysis, develop and test hypotheses, and deliver insights with minimal supervision.
Experience identifying and defining KPIs for business areas such as Sales, Consumer Behavior, Supply Chain, etc.
Exceptional SQL skills.
Experience with modern visualization tools like Tableau, Power BI, Domo, etc.
Knowledge of open-source, big data, and cloud infrastructure such as AWS, Hive, Snowflake, Presto, etc.
Incredible attention to detail with a structured problem-solving approach.
Excellent communication skills (written and verbal).
Experience with agile development methodologies.
Experience in retail or e-commerce domains is a plus.
How to Apply: Interested candidates can share their CV at pragati.jha@gspann.com.
Posted 1 month ago
7 - 12 years
50 - 75 Lacs
Bengaluru
Work from Office
---- What the Candidate Will Do ---- Partner with engineers, analysts, and product managers to define technical solutions that support business goals Contribute to the architecture and implementation of distributed data systems and platforms Identify inefficiencies in data processing and proactively drive improvements in performance, reliability, and cost Serve as a thought leader and mentor in data engineering best practices across the organization ---- Basic Qualifications ---- 7+ years of hands-on experience in software engineering with a focus on data engineering Proficiency in at least one programming language such as Python, Java, or Scala Strong SQL skills and experience with large-scale data processing frameworks (e.g., Apache Spark, Flink, MapReduce, Presto) Demonstrated experience designing, implementing, and operating scalable ETL pipelines and data platforms Proven ability to work collaboratively across teams and communicate technical concepts to diverse stakeholders ---- Preferred Qualifications ---- Deep understanding of data warehousing concepts and data modeling best practices Hands-on experience with Hadoop ecosystem tools (e.g., Hive, HDFS, Oozie, Airflow, Spark, Presto) Familiarity with streaming technologies such as Kafka or Samza Expertise in performance optimization, query tuning, and resource-efficient data processing Strong problem-solving skills and a track record of owning systems from design to production
Posted 1 month ago
4 - 7 years
19 - 25 Lacs
Hyderabad
Work from Office
Overview
The primary focus is development work within the Azure Data Lake environment and other related ETL technologies, with responsibility for on-time and on-budget delivery, satisfying project requirements while adhering to enterprise architecture standards. This role also carries L3 responsibilities for ETL processes.
Responsibilities
Deliver key Azure Data Lake projects within time and budget.
Contribute to solution design and build to ensure scalability, performance, and reuse of data and other components.
Ensure on-time and on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards.
Possess strong problem-solving abilities with a focus on managing to business outcomes through collaboration with multiple internal and external parties.
Enthusiastic, willing, and able to learn and continuously develop skills and techniques; enjoys change and seeks continuous improvement.
A clear communicator, both written and verbal, with good presentational skills; fluent and proficient in the English language.
Customer focused and a team player.
Qualifications
Bachelor's degree in Computer Science, MIS, Business Management, or a related field
5+ years of experience in Information Technology
4+ years of experience in Azure Data Lake
Technical Skills
Proven experience in development activities on Data, BI, or Analytics projects
Solutions delivery experience: knowledge of the system development lifecycle, integration, and sustainability
Strong knowledge of PySpark and SQL
Good knowledge of Azure Data Factory or Databricks
Knowledge of Presto/Denodo is desirable
Knowledge of FMCG business processes is desirable
Non-Technical Skills
Excellent remote collaboration skills
Experience working in a matrix organization with diverse priorities
Exceptional written and verbal communication skills, along with collaboration and listening skills
Ability to work with agile delivery methodologies
Ability to ideate requirements and design iteratively with business partners without formal requirements documentation
Posted 1 month ago
3 - 7 years
5 - 9 Lacs
Chandigarh
Work from Office
About The Role
The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage the career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate the team vision and clear objectives.
Senior Process Manager roles and responsibilities:
Understand the business model and why things are the way they are; ask relevant questions and get them clarified.
Break down complex problems into small solvable components to identify problem areas in each component.
Conduct cost/benefit analysis and feasibility studies for proposed projects to aid in decision-making.
Facilitate the implementation of new or improved business processes and systems.
Coordinate with business stakeholders to identify gaps in data and processes, and suggest process improvements.
Understand and follow the project roadmap, plan data availability, and coordinate with the execution team to ensure successful execution of projects.
Prescribe suitable solutions with an understanding of the limitations of toolsets and available data.
Manage procurement of data from various sources and perform data audits.
Fetch and analyze data from disparate sources and drive meaningful insights.
Provide recommendations on the business rules for effective campaign targeting.
Interpret analytical results and provide insights; present key findings and recommended next steps to clients.
Develop tangible analytical projects; communicate project details to clients and the internal delivery team via written documents and presentations, in the form of specifications, diagrams, and data/process models.
Audit deliverables, ensuring accuracy by critically examining the data and reports against requirements.
Collaborate on regional/global analytic initiatives and localize inputs for country campaign practices.
Actively work on audience targeting insights, optimize campaigns, and improve communications governance.
Technical and Functional Skills:
Must Have
BS/BA degree or equivalent professional experience required.
Minimum 7+ years of professional experience in advanced analytics for a Fortune 500-scale company or a prominent consulting organization.
Experience in data extraction tools, advanced Excel, CRM analytics, campaign marketing, and campaign analytics.
Strong numerical and analytical skills.
Strong in advanced Excel (prior experience with Google Sheets is an added plus).
Strong analytical and storytelling skills; ability to derive relevant insights from large reports and piles of disparate data.
Comfortable working autonomously with broad guidelines.
Passion for data and analytics for marketing, and eagerness to learn.
Excellent communication skills, both written and spoken; ability to explain complex technical concepts in plain English.
Ability to manage multiple priorities and projects, aligning teams to project timelines and ensuring quality of deliverables.
Work with business teams to identify business use cases and develop solutions to meet these needs using analytical approaches.
Manage regular reporting and ad-hoc data extracts for other departments.
Knowledge of analyzing digital campaigns and the tools/technologies of performance marketing.
Experience with Google Sheets/Excel.
Good To Have
Hands-on experience in digital marketing and/or 1:1 marketing in any channel; expert-level knowledge in database marketing and CRM.
Working knowledge of data visualization tools (Tableau, QlikView, etc.).
Working knowledge of analytical/statistical techniques.
Experience in a Hadoop environment (Hive, Presto) is a plus.
Experience in Python/R.
Previous consulting experience is a definite plus.
Posted 1 month ago