5.0 - 7.0 years
0 Lacs
Mumbai, India
On-site
About Us

Preferred job location: Mumbai, India

When you work with us, you'll find that we deliver results without compromising on respect. We value each other's differences while recognising individual strengths. We are the world's leading contract logistics company. We create competitive advantage for our customers through customized warehousing and transportation services, combining our global scale with local knowledge and sector expertise. At DHL Supply Chain (DSC), there's more to a role than the work we do. Whatever your role is, we never forget that you make us who we are. We work hard to make sure a career with DHL is as satisfying and successful as it can be. Join a supportive work environment where you'll have the tools and training you need to grow and succeed. DHL Supply Chain is Great Place To Work certified.

Responsibilities

The Regional Warehousing IT Consultant is responsible for leading WMS data solutions and supporting MAWM activities. The role involves designing, delivering, and supporting strategic data solutions and structured analysis of business data requirements within the WMS ecosystem strategy in the APAC region. It also involves improving and supporting Manhattan Active WMS standardization and serving as a key contact for WMS-related queries and issues. Overall, the role of a Regional Warehousing IT Solution Consultant is multifaceted, requiring a blend of technical knowledge, project management skills, and the ability to collaborate with various stakeholders. It is a critical role in ensuring that WMS solutions are effectively implemented and aligned with the region's strategic goals, ultimately enhancing operational efficiency and data management within the IT warehousing environment.

Collaborate closely with the IT Solution Lead, APAC WD Director, and APAC WD VP to ensure the successful implementation of the strategic roadmap for the WMS solution in the APAC region.

Data solutions
Develop and implement data solutions in the warehousing domain, such as standard reporting policy, archiving, data use, real-time operational dashboards, and key red flags, that address specific business requirements within the WMS ecosystem in the APAC region.
Create and optimize common data sources (e.g. DB views) as a baseline for shared usage, in alignment with the APAC data infrastructure strategy (see the sketch after this posting).
Play a key role in proposing appropriate data solutions from approved data tech stacks and available data products in the APAC MAWM Data Ecosystem that fit specific business use-case requirements.
Lead and manage UAT activities between stakeholders and delivery partners to ensure data solutions are validated against business requirements before release into production.

MAWM-related activities
Issue management and coordination between APAC country leads, the Center of Excellence (CoE), and the vendor.
Day-to-day support of the Regional Warehousing IT Solution Lead.
Pioneering new MAWM functionalities that can improve operational efficiency while maximizing WMS capabilities.
Testing and supporting the regional standard solution.
Multi-country solution design.

Stakeholder collaboration and communication
Provide regular communications to stakeholders and internal teams to keep everyone updated on progress and address any issues that arise.
Provide support to the DHL DSC APAC data analytics team during the data mapping and data validation phases of warehousing data product builds and enhancements.
Coordinate the data standardization approach across the APAC IT Business Unit.

Requirements
Minimum 5 years of working experience with WMS and data-related solutions (reporting, data analytics, and visualization in the supply chain industry).
Data-related certifications would be a plus.
Experience or certification in BI tools such as Power BI and Qlik Sense.
Experience with SQL/NoSQL databases such as Oracle, MS SQL, and MySQL.
Experience with Databricks SQL and Snowflake preferred.
MHE solution experience preferred.
Strong experience in the following data disciplines: data management, data governance, data analytics, and data visualization.
Strong problem-solving and continuous-improvement mindset to overcome challenges.
Good interpersonal and communication skills.
Ability to drive regional data strategy.
Clear understanding of how to interpret data and visualize results for warehouse floor users as well as all management levels.
Understanding of the relationship between DB data/records and their business usage.

Job applications will be open until 31 July 2025.
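The posting above asks for a common data source (e.g. DB views) shared across reporting and dashboards. Purely as an illustrative sketch, not DHL's actual design, the example below uses PySpark on a Databricks-style environment to publish a shared view over two invented WMS tables (wms_raw.outbound_orders, wms_raw.shipments) and then query it for a simple KPI; every schema, table, and column name is an assumption made for the example.

```python
# Minimal sketch: publish a shared "common data source" view, then query it.
# All object names (wms_shared, wms_raw.*, columns) are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("CREATE SCHEMA IF NOT EXISTS wms_shared")

# Define the shared view once; downstream users query it instead of raw tables,
# so join logic and derived columns live in a single place.
spark.sql("""
    CREATE OR REPLACE VIEW wms_shared.v_order_fulfilment AS
    SELECT
        o.order_id,
        o.site_code,
        o.created_ts,
        s.shipped_ts,
        (unix_timestamp(s.shipped_ts) - unix_timestamp(o.created_ts)) / 3600.0
            AS hours_to_ship
    FROM wms_raw.outbound_orders AS o
    LEFT JOIN wms_raw.shipments AS s
        ON s.order_id = o.order_id
""")

# Example downstream use: a per-site fulfilment KPI for an operational dashboard.
daily_kpi = spark.sql("""
    SELECT site_code, AVG(hours_to_ship) AS avg_hours_to_ship
    FROM wms_shared.v_order_fulfilment
    GROUP BY site_code
""")
daily_kpi.show()
```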
Posted 19 hours ago
9.0 - 11.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description

Qualifications:
Overall 9+ years of IT experience.
A minimum of 5+ years managing Data Lakehouse environments is preferred; specific experience with Azure Databricks, Snowflake, and DBT (nice to have) is a plus.
Hands-on experience with data warehousing, data lake/lakehouse solutions, data pipelines (ELT/ETL), SQL, Spark/PySpark, and DBT.
Strong understanding of data modelling, SDLC, Agile, and DevOps principles.
Bachelor's degree in management/computer information systems, computer science, accounting information systems, or a relevant field.

Knowledge/Skills:
Tools and technologies: Azure Databricks, Apache Spark, Python, Databricks SQL, Unity Catalog, and Delta Live Tables. Understanding of cluster configuration and the compute and storage layers.
Expertise in Snowflake architecture, with experience in design, development, and evolution.
System integration experience, including data extraction, transformation, and quality-controls design techniques.
Familiarity with data science concepts, as well as MDM, business intelligence, and data warehouse design and implementation techniques.
Extensive experience with the medallion architecture data management framework as well as Unity Catalog (a minimal sketch follows this posting).
Data modeling and information classification expertise at the enterprise level.
Understanding of metamodels, taxonomies, and ontologies, as well as of the challenges of applying structured techniques (data modeling) to less-structured sources.
Ability to assess rapidly changing technologies and apply them to business needs.
Ability to translate the information architecture's contribution to business outcomes into simple briefings for use by various data-and-analytics-related roles.

About Us
Datavail is a leading provider of data management, application development, analytics, and cloud services, with more than 1,000 professionals helping clients build and manage applications and data via a world-class tech-enabled delivery platform and software solutions across all leading technologies. For more than 17 years, Datavail has worked with thousands of companies spanning different industries and sizes, and is an AWS Advanced Tier Consulting Partner, a Microsoft Solutions Partner for Data & AI and Digital & App Innovation (Azure), an Oracle Partner, and a MySQL Partner.

About The Team
Datavail's Data Management Services: Datavail's Data Management and Analytics practice is made up of experts who provide a variety of data services, including initial consulting and development, designing and building complete data systems, and ongoing support and management of database, data warehouse, data lake, data integration, and virtualization and reporting environments. Datavail's team is comprised not just of excellent BI & analytics consultants, but of great people as well. Datavail's data intelligence consultants are experienced, knowledgeable, and certified in best-in-breed BI and analytics software applications and technologies. We ascertain your business objectives, goals, and requirements, assess your environment, and recommend the tools that best fit your unique situation. Our proven methodology can help your project succeed, regardless of stage. The combination of a proven delivery model and top-notch experience ensures that Datavail will remain the data management experts on demand you desire. Datavail's flexible and client-focused services always add value to your organization.
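The posting above names the medallion architecture and Delta Live Tables. As a hedged illustration only, assuming a Databricks DLT pipeline environment (where the `dlt` module and a global `spark` session are provided) and using invented table names and storage paths, the sketch below declares bronze, silver, and gold tables for an orders feed.

```python
# Minimal medallion-style sketch for a Delta Live Tables pipeline.
# Runs inside a Databricks DLT pipeline; paths and names are illustrative only.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw orders ingested as-is from cloud storage")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")      # `spark` is provided by DLT
        .option("cloudFiles.format", "json")
        .load("/mnt/raw/orders")                   # hypothetical landing path
    )

@dlt.table(comment="Silver: typed, deduplicated, validated orders")
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .dropDuplicates(["order_id"])
        .filter(F.col("order_id").isNotNull())
    )

@dlt.table(comment="Gold: daily order counts for reporting")
def orders_gold_daily():
    return (
        dlt.read("orders_silver")
        .groupBy(F.to_date("order_ts").alias("order_date"))
        .count()
    )
```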
Posted 2 days ago
8.0 - 15.0 years
0 Lacs
Karnataka
On-site
This is a hands-on Databricks Senior Developer position within State Street Global Technology Services. We are seeking a candidate with a strong understanding of Big Data technology and significant development expertise with Databricks. In this role, you will be responsible for managing the Databricks platform for the application, implementing enhancements, performance improvements, and AI/ML use cases, as well as leading a team.

As a Databricks Sr. Developer, your responsibilities will include designing and developing custom high-throughput and configurable frameworks/libraries (a sketch of this pattern follows the posting). You should possess the ability to drive change through collaboration, influence, and the demonstration of proofs of concept. Additionally, you will be accountable for all aspects of the software development lifecycle, from design and coding to integration testing, deployment, and documentation. Collaboration within an agile project team is essential, and you must ensure that the team adheres to best practices and coding standards. Providing technical mentoring to the team and overseeing the ETL team are also key aspects of this role.

To excel in this position, the following skills are highly valued: data analysis and data exploration experience, familiarity with agile delivery environments, hands-on development skills in Java, exposure to DevOps best practices and CI/CD (such as Jenkins), proficiency in working within a multi-developer environment using version control (e.g., Git), strong knowledge of Databricks SQL/PySpark for data engineering pipelines, expertise in Unix, Python, and complex SQL, and strong critical thinking, communication, and problem-solving abilities. Troubleshooting DevOps pipelines and experience with AWS services are also essential.

The ideal candidate will hold a Bachelor's degree in a computer- or IT-related field, with at least 15 years of overall Big Data pipeline experience, 8+ years of hands-on experience with Databricks, and 8+ years of cloud-based development expertise, including AWS services.

Job ID: R-774606
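The role above emphasizes configurable data engineering frameworks built with Databricks SQL/PySpark. The following is only a hypothetical illustration of that pattern, not State Street's framework: a single config-driven PySpark step whose source, filters, and Delta target come from a dictionary. Every table name and column in it is invented for the example.

```python
# Sketch of a config-driven pipeline step: behaviour comes from a config dict,
# not hard-coded table names, so one function serves many feeds.
from pyspark.sql import SparkSession, functions as F

def run_step(spark, config: dict) -> None:
    """Read a source table, apply configured equality filters, write to Delta."""
    df = spark.read.table(config["source_table"])
    for column, value in config.get("filters", {}).items():
        df = df.filter(F.col(column) == value)
    (df.write.format("delta")
       .mode(config.get("mode", "overwrite"))
       .saveAsTable(config["target_table"]))

if __name__ == "__main__":
    spark = SparkSession.builder.getOrCreate()
    run_step(spark, {
        "source_table": "raw.trades",            # hypothetical source
        "filters": {"region": "APAC"},
        "target_table": "curated.trades_apac",   # hypothetical target
        "mode": "overwrite",
    })
```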
Posted 4 days ago
8.0 - 13.0 years
30 - 45 Lacs
Hyderabad
Work from Office
Role: We're looking for a skilled Databricks Solution Architect to lead the design and implementation of data migration strategies and cloud-based data and analytics transformation on the Databricks platform. This role involves collaborating with stakeholders, analyzing data, defining architecture, building data pipelines, ensuring security and performance, and implementing Databricks solutions for machine learning and business intelligence.

Key Responsibilities:
Define the architecture and roadmap for cloud-based data and analytics transformation on Databricks.
Design, implement, and optimize scalable, high-performance data architectures using Databricks.
Build and manage data pipelines and workflows within Databricks.
Ensure that best practices for security, scalability, and performance are followed.
Implement Databricks solutions that enable machine learning, business intelligence, and data science workloads.
Oversee the technical aspects of the migration process, from planning through to execution.
Create documentation of the architecture, migration processes, and solutions.
Provide training and support to teams post-migration to ensure they can leverage Databricks.

Preferred candidate profile:

Experience:
7+ years of experience in data engineering, cloud architecture, or related fields.
3+ years of hands-on experience with Databricks, including the implementation of data engineering solutions, migration projects, and workload optimization.
Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their integration with Databricks.
Experience in end-to-end data migration projects involving large-scale data infrastructure.
Familiarity with ETL tools, data lakes, and data warehousing solutions.

Skills:
Expertise in Databricks architecture and best practices for data processing.
Strong knowledge of Spark, Delta Lake, DLT, Lakehouse architecture, and other recent Databricks components.
Proficiency in Databricks Asset Bundles.
Expertise in the design and development of migration frameworks using Databricks (a minimal sketch follows this posting).
Proficiency in Python, Scala, SQL, or similar languages for data engineering tasks.
Familiarity with data governance, security, and compliance in cloud environments.
Solid understanding of cloud-native data solutions and services.
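The architect role above stresses Delta Lake and migration frameworks on Databricks. As a generic, hedged example rather than the team's actual approach, the sketch below shows one common incremental-load building block used during such migrations: upserting a batch of changed rows into a Delta table with a MERGE via the delta-spark DeltaTable API. The table names and join key are placeholders, and the environment is assumed to have delta-spark available.

```python
# Minimal incremental upsert into Delta Lake (names are illustrative only).
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# A batch of changed rows staged by an upstream extract (hypothetical table).
changes = spark.read.table("staging.customer_changes")

# The Delta target being migrated to / maintained (hypothetical table).
target = DeltaTable.forName(spark, "lakehouse.customers")

# Upsert: update rows that already exist, insert the rest.
(target.alias("t")
    .merge(changes.alias("c"), "t.customer_id = c.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```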
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Chennai
Remote
Role & responsibilities
Develop, maintain, and enhance new data sources and tables, contributing to data engineering efforts to ensure a comprehensive and efficient data architecture.
Serve as the liaison between the Data Engineering team and the airport operations teams, developing new data sources and overseeing enhancements to the existing database; act as one of the main contact points for data requests, metadata, and statistical analysis.
Migrate all existing Hive Metastore tables to Unity Catalog, addressing access issues and ensuring a smooth transition of jobs and tables (a minimal sketch follows this posting).
Collaborate with IT teams to validate package (gold-level data) table outputs during the production deployment of developed notebooks.
Develop and implement data quality alerting systems and Tableau alerting mechanisms for dashboards, setting up notifications for various thresholds.
Create and maintain standard reports and dashboards to provide insights into airport performance, helping guide stations to optimize operations and improve performance.

Preferred candidate profile
Master's degree / UG
Minimum 5-10 years of experience
Databricks (Azure)
Good communication
Experience developing solutions on a Big Data platform utilizing tools such as Impala and Spark
Advanced knowledge/experience with Azure Databricks, PySpark, Teradata/Databricks SQL
Advanced knowledge/experience in Python along with associated development environments (e.g. JupyterHub, PyCharm, etc.)
Advanced knowledge/experience in building Tableau, QlikView, or Power BI dashboards
Basic knowledge of HTML and JavaScript
Immediate joiner

Skills, Licenses & Certifications
Strong project management skills
Proficient with Microsoft Office applications (MS Excel, Access, and PowerPoint); advanced knowledge of Microsoft Excel
Advanced aptitude in problem-solving, including the ability to logically structure an appropriate analytical framework
Proficient in SharePoint and Power Apps, and ability to use the Graph API
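A central responsibility above is migrating Hive Metastore tables to Unity Catalog. The sketch below shows one commonly used approach for managed Delta tables on Databricks, a DEEP CLONE into a Unity Catalog schema. It is an assumption-laden illustration: the catalog name `main`, the schema and table names, and the single-workspace setup are all hypothetical, and external tables, views, jobs, and grants would need separate handling.

```python
# Minimal sketch: copy managed Delta tables from hive_metastore into Unity
# Catalog with DEEP CLONE (all catalog/schema/table names are hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

tables = [("ops", "flight_delays"), ("ops", "gate_assignments")]  # illustrative list

for schema, table in tables:
    spark.sql(f"CREATE SCHEMA IF NOT EXISTS main.{schema}")
    spark.sql(f"""
        CREATE OR REPLACE TABLE main.{schema}.{table}
        DEEP CLONE hive_metastore.{schema}.{table}
    """)
    print(f"Cloned hive_metastore.{schema}.{table} -> main.{schema}.{table}")
```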
Posted 1 month ago