10.0 - 12.0 years
13 - 20 Lacs
Chennai
Work from Office
Key Responsibilities:
As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from sources including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models (Relational, Dimensional, Columnar, Big Data) to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies such as Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, managing and prioritizing multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across multiple simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI
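The dimensional-modeling and complex-SQL requirements above can be illustrated with a minimal star-schema query. This is an illustrative sketch only, using stdlib SQLite in place of Snowflake; the table and column names (`fact_sales`, `dim_customer`) are hypothetical, not from the posting:

```python
import sqlite3

# Hypothetical star schema: a sales fact table joined to a customer dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    segment TEXT
);
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount REAL
);
""")
cur.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                [(1, "retail"), (2, "enterprise")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(10, 1, 100.0), (11, 1, 50.0), (12, 2, 900.0)])

# The typical dimensional pattern: aggregate the fact table by an attribute
# of the conformed dimension.
rows = cur.execute("""
    SELECT d.segment, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.segment
    ORDER BY d.segment
""").fetchall()
```

The same fact-joins-dimension shape scales to Snowflake; only the dialect and volume change.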
Posted 1 month ago
10.0 - 12.0 years
13 - 20 Lacs
Bengaluru
Work from Office
Key Responsibilities:
As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from sources including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models (Relational, Dimensional, Columnar, Big Data) to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies such as Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, managing and prioritizing multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across multiple simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI
Posted 1 month ago
10.0 - 12.0 years
13 - 20 Lacs
Jaipur
Work from Office
Key Responsibilities:
As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from sources including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models (Relational, Dimensional, Columnar, Big Data) to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies such as Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, managing and prioritizing multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across multiple simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI
Posted 1 month ago
10.0 - 20.0 years
45 - 55 Lacs
Noida, Hyderabad, Gurugram
Work from Office
Data Architect, Telecom Domain

To design comprehensive data architecture and technical solutions for telecommunications industry challenges, leveraging TM Forum frameworks and modern data platforms. To work closely with customers and technology partners to deliver data solutions that address complex telecommunications business requirements, including customer experience management, network optimization, revenue assurance, and digital transformation initiatives.

Responsibilities:
- Design and articulate enterprise-scale telecom data architectures incorporating TM Forum standards and frameworks, including SID (Shared Information/Data Model), TAM (Telecom Application Map), and eTOM (enhanced Telecom Operations Map)
- Develop comprehensive data models aligned with TM Forum guidelines for telecommunications domains such as Customer, Product, Service, Resource, and Partner management
- Create data architectures that support telecom-specific use cases including customer journey analytics, network performance optimization, fraud detection, and revenue assurance
- Design solutions leveraging Microsoft Azure and Databricks for telecom data processing and analytics
- Conduct technical discovery sessions with telecom clients to understand their OSS/BSS architecture, network analytics needs, customer experience requirements, and digital transformation objectives
- Design and deliver proofs of concept (POCs) and technical demonstrations showcasing modern data platforms solving real-world telecommunications challenges
- Create comprehensive architectural diagrams and implementation roadmaps for telecom data ecosystems spanning cloud, on-premises, and hybrid environments
- Evaluate and recommend appropriate big data technologies, cloud platforms, and processing frameworks based on telecom-specific requirements and regulatory compliance needs
- Design data governance frameworks compliant with telecom industry standards and regulatory requirements (GDPR, data localization, etc.)
- Stay current with the latest advancements in data technologies, including cloud services, data processing frameworks, and AI/ML capabilities
- Contribute to the development of best practices, reference architectures, and reusable solution components for accelerating proposal development

Qualifications:
- Bachelor's or Master's degree in Computer Science, Telecommunications Engineering, Data Science, or a related technical field
- 10+ years of experience in data architecture, data engineering, or solution architecture roles, with at least 5 years in the telecommunications industry
- Deep knowledge of TM Forum frameworks, including SID, eTOM, and TAM, and their practical implementation in telecom data architectures
- Demonstrated ability to estimate project efforts, resource requirements, and implementation timelines for complex telecom data initiatives
- Hands-on experience building data models and platforms aligned with TM Forum standards and telecommunications business processes
- Strong understanding of telecom OSS/BSS systems, network management, customer experience management, and revenue management domains
- Hands-on experience with data platforms including Databricks and Microsoft Azure in telecommunications contexts
- Experience with modern data processing frameworks such as Apache Kafka, Spark, and Airflow for real-time telecom data streaming
- Proficiency in the Azure cloud platform and its data services, with an understanding of telecom-specific deployment requirements
- Knowledge of system monitoring and observability tools for telecommunications data infrastructure
- Experience implementing automated testing frameworks for telecom data platforms and pipelines
- Familiarity with telecom data integration patterns, ETL/ELT processes, and data governance practices specific to telecommunications
- Experience designing and implementing data lakes, data warehouses, and machine learning pipelines for telecom use cases
- Proficiency in programming languages commonly used in data processing (Python, Scala, SQL), with telecom domain applications
- Understanding of telecommunications regulatory requirements and data privacy compliance (GDPR, local data protection laws)
- Excellent communication and presentation skills, with the ability to explain complex technical concepts to telecom stakeholders
- Strong problem-solving skills and the ability to think creatively to address telecommunications industry challenges
- Good to have: TM Forum certifications or other telecommunications industry certifications
- Relevant data platform certifications such as Databricks or Azure Data Engineer are a plus

Willingness to travel as required. If you meet all or most of the criteria, contact bdm@intellisearchonline.net, M: 9341626895
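The Customer/Product/Service/Resource domains the posting names come from TM Forum's SID. As a rough, SID-inspired sketch only (the attribute names here are simplified assumptions, not the actual TM Forum specification), the entity chain can be modeled with plain dataclasses:

```python
from dataclasses import dataclass, field

# SID-inspired entity chain: Customer -> Product -> Service -> Resource.
# Attribute names are illustrative simplifications of the TM Forum model.
@dataclass
class Resource:
    resource_id: str
    resource_type: str            # e.g. "cell-site", "port"

@dataclass
class Service:
    service_id: str
    resources: list = field(default_factory=list)

@dataclass
class Product:
    product_id: str
    name: str
    services: list = field(default_factory=list)

@dataclass
class Customer:
    customer_id: str
    products: list = field(default_factory=list)

    def resource_footprint(self):
        """All resource ids serving this customer, across products/services.
        Useful for customer-impact analysis when a network resource fails."""
        return {r.resource_id
                for p in self.products
                for s in p.services
                for r in s.resources}

cust = Customer("C1", products=[
    Product("P1", "5G Mobile", services=[
        Service("S1", resources=[Resource("R1", "cell-site")])])])
```

Traversing product-to-resource links like this is the backbone of SID-aligned use cases such as fault-to-customer impact mapping.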
Posted 1 month ago
15.0 - 19.0 years
40 - 45 Lacs
Pune
Work from Office
Skill Name: Data Architect with Azure & Databricks + Power BI
Experience: 15 - 19 years

Responsibilities:
- Architect and design end-to-end data solutions on a cloud platform, focusing on data warehousing and big data platforms.
- Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions.
- Develop high-level and detailed data architecture and design documentation.
- Implement data management and data governance strategies, ensuring compliance with industry standards.
- Architect both batch and real-time data solutions, leveraging cloud-native services and technologies.
- Design and manage data pipeline processes for historic data migration and data integration.
- Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables.
- Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies.
- Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively.
- Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.

Requirements:
- Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments.
- Extensive experience with common Azure services such as ADLS, Synapse, Databricks, and Azure SQL.
- Experience with Azure services such as ADF, PolyBase, and Azure Stream Analytics.
- Proven expertise in Databricks architecture, Delta Lake, Delta Sharing, Unity Catalog, data pipelines, and Spark tuning.
- Strong knowledge of Power BI architecture, DAX, and dashboard optimization.
- In-depth experience with SQL, Python, and/or PySpark.
- Hands-on knowledge of data governance, lineage, and cataloging tools such as Azure Purview and Unity Catalog.
- Experience implementing CI/CD pipelines for data and BI components (e.g., using Azure DevOps or GitHub).
- Experience building semantic models in Power BI.
- Strong expertise in data exploration using SQL and a deep understanding of data relationships.
- Extensive knowledge and implementation experience in data management, governance, and security frameworks.
- Proven experience creating high-level and detailed data architecture and design documentation.
- Strong aptitude for business analysis to understand domain data requirements.
- Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred.
- Hands-on experience architecting end-to-end data solutions for both batch and real-time designs.
- Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
- Familiarity with Data Fabric and Data Mesh architecture is a plus.
- Excellent verbal and written communication skills.
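Semantic models in Power BI conventionally hang DAX time intelligence off a contiguous date dimension. A minimal sketch of generating one, in plain Python rather than Power Query (the column names here are common conventions, not a prescribed schema):

```python
from datetime import date, timedelta

def build_date_dimension(start: date, end: date) -> list:
    """Generate a contiguous date-dimension table, one row per calendar day.
    A table like this is the usual prerequisite for DAX time-intelligence
    functions (YTD, same-period-last-year, etc.) in a Power BI model."""
    rows = []
    d = start
    while d <= end:
        rows.append({
            "date": d.isoformat(),
            "year": d.year,
            "month": d.month,
            "quarter": (d.month - 1) // 3 + 1,   # calendar quarter 1..4
            "is_weekend": d.weekday() >= 5,      # Sat=5, Sun=6
        })
        d += timedelta(days=1)
    return rows

dim = build_date_dimension(date(2024, 1, 1), date(2024, 1, 7))
```

In a real model the equivalent is marked as the date table and related to every fact table's date key.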
Posted 1 month ago
10.0 - 15.0 years
15 - 30 Lacs
Noida, Pune, Bengaluru
Work from Office
Roles and responsibilities:
- Work closely with the Product Owners and stakeholders to design the technical architecture for the data platform to meet the requirements of the proposed solution.
- Work with the leadership to set the standards for software engineering practices within the machine learning engineering team, and support across other disciplines.
- Play an active role in leading team meetings and workshops with clients.
- Choose and use the right analytical libraries, programming languages, and frameworks for each task.
- Help the Data Engineering team produce high-quality code that allows us to put solutions into production.
- Create and own the technical product backlogs for products; help the team close the backlogs on time.
- Refactor code into reusable libraries, APIs, and tools.
- Help us shape the next generation of our products.

What We're Looking For:
- 10+ years of total experience in data management, with experience implementing modern data ecosystems on AWS/cloud platforms.
- Strong experience with AWS ETL/file-movement tools (Glue, Athena, Lambda, Kinesis, and the wider AWS integration stack).
- Strong experience with Agile development and SQL.
- Strong experience with two or three AWS database technologies (Redshift, Aurora, RDS, S3, and other AWS data services), covering security, policies, and access management.
- Strong programming experience with Python and Spark.
- Quick to pick up new technologies.
- Experience with Apache Airflow and other automation stacks.
- Excellent data modeling skills.
- Excellent oral and written communication skills.
- A high level of intellectual curiosity, external perspective, and interest in innovation.
- Strong analytical, problem-solving, and investigative skills.
- Experience applying quality and compliance requirements.
- Experience with security models and development on large data sets.
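The Glue/Athena file-movement work above typically comes down to laying files out under Hive-style partition keys that both Glue crawlers and Athena understand. A minimal sketch of that key convention (the bucket and table names are made up for illustration):

```python
from datetime import date

def partition_key(table: str, dt: date, filename: str,
                  bucket: str = "analytics-lake") -> str:
    """Build a Hive-style partitioned S3 key (dt=YYYY-MM-DD).
    Partitioning on the date column lets Athena prune scans to the
    requested days instead of reading the whole table."""
    return f"s3://{bucket}/{table}/dt={dt.isoformat()}/{filename}"

key = partition_key("orders", date(2024, 3, 1), "part-0000.parquet")
```

A Lambda or Glue job writing under keys like this makes each day's load independently replayable, which is what keeps backfills cheap.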
Posted 1 month ago
7.0 - 12.0 years
16 - 31 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
DATA ARCHITECT: Data Architecture, Big Data, Data Modeling, or Database Administration; any DBMS (Oracle/SQL Server/PostgreSQL/MySQL). Database Expert: Database Management, SQL, Data Modeling, Data Warehousing, ETL; any DBMS (Oracle/SQL Server/PostgreSQL/MySQL)
Posted 1 month ago
3.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Hybrid
Additional Career Level Description

Knowledge and application: Applies advanced, wide-ranging experience and in-depth professional knowledge to develop and resolve complex models and procedures in creative ways. Directs the application of existing principles and guides the development of new policies and ideas. Determines own methods and procedures on new assignments.

Problem solving: Understands and works on complex issues where analysis of the situation or data requires in-depth evaluation of variable factors; solutions may need to be devised from limited information. Exercises judgment in selecting methods and in evaluating and adapting complex techniques and evaluation criteria for obtaining results.

Interaction: Frequently advises key people outside own area of expertise on complex matters, using persuasion in delivering messages.

Impact: Develops and manages operational initiatives to deliver tactical results and achieve medium-term goals.

Accountability: May be accountable through the team for delivery of tactical business targets. Work is reviewed upon completion and is consistent with departmental objectives.
Posted 1 month ago
5.0 - 10.0 years
35 - 50 Lacs
Bengaluru
Hybrid
Skill Name: SQL Server 2012 / SQL Server Engineer (5 - 10 years)
Work Location: Bangalore (preferred) / Any Deloitte USI location

Key responsibilities:
- Understand the customer use case and available data to prepare for automated and ongoing data ingestion that meets customer requirements
- Design the technical solution, covering data ingestion, transformation, and scheduling, for review and sign-off with the customer
- Play the role of tech lead, responsible for end-to-end technical solutions
- Identify AEP enhancements, extending frameworks and incorporating new ideas
- Collaborate closely with other AEP team members (sales teams, engineers, consultants) and onshore teams to deliver projects
- Enterprise-level software development leveraging Big Data, cloud technologies, and Python
- Build and operate highly scalable, fault-tolerant, distributed systems for extraction and ingestion to process large data sets
- Perform data analysis, modeling, and mapping, coordinating closely with Data Architect(s)
- Build the schemas and workflows needed to ingest customer data into AEP successfully
- Create the identity namespaces, privacy settings, and merge policies required to build the solution
- Build audiences (segmentations) and create the pipeline needed for destination activation
- Deploy the final solution to a production environment (or end-state environment)
- Post-deployment, provide ongoing maintenance and support of the solution, and knowledge transfer to the offshore support team
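Audience building in AEP is configured through the platform's segmentation service, but conceptually a segment definition reduces to a membership rule evaluated over unified profiles. A hypothetical sketch, with made-up profile fields, of that reduction:

```python
# Hypothetical unified customer profiles; field names are illustrative,
# not an actual AEP/XDM schema.
profiles = [
    {"id": "p1", "loyalty_tier": "gold", "last_purchase_days": 12},
    {"id": "p2", "loyalty_tier": "bronze", "last_purchase_days": 200},
    {"id": "p3", "loyalty_tier": "gold", "last_purchase_days": 95},
]

def in_audience(profile: dict) -> bool:
    """Membership rule: gold-tier customers active in the last 90 days.
    In AEP this rule would be expressed in the segment builder; here it
    is just a predicate over the profile record."""
    return (profile["loyalty_tier"] == "gold"
            and profile["last_purchase_days"] <= 90)

# The qualified audience, ready for destination activation.
audience = [p["id"] for p in profiles if in_audience(p)]
```

Thinking of a segment as a testable predicate like this makes it easy to validate membership logic before wiring up the destination pipeline.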
Posted 1 month ago
6.0 - 12.0 years
11 - 15 Lacs
Pune
Work from Office
As a Data Architect, you'll design and optimize data architecture to ensure data is accurate, secure, and accessible. You'll collaborate across teams to shape the data strategy, implement governance, and promote best practices, enabling the business to gain insights, innovate, and make data-driven decisions at scale.

Your responsibilities:
- Define the enterprise data architecture, streamlining, standardising, and enhancing the accessibility of organisational data.
- Elicit data requirements from senior business stakeholders and the broader IS function, translating their needs into conceptual, logical, and physical data models.
- Oversee the effective integration of data from various sources, ensuring data quality and consistency.
- Monitor and optimise data performance, collaborating with Data Integration and Product teams to deliver changes that improve data performance.
- Support the business, the Data Integration Platforms team, and wider IS management in defining a data governance framework that sets out how data will be governed, accessed, and secured across the organisation; support the operation of the data governance model as a subject matter advisor.
- Advise Data Platform teams in defining the data platform architecture, covering metadata, data integration, business intelligence, and data storage needs.
- Support the Data Integration Platforms team and other senior IS stakeholders in defining a data vision and strategy, setting out how the organisation will exploit its data for maximum business value.
- Build and maintain a repository of data architecture artefacts (e.g., a data dictionary).

What we're looking for:
- Proven track record in defining enterprise data architectures, data models, and database/data warehouse solutions.
- Evidenced ability to advise on the use of key data platform architectural components (e.g., Azure Lakehouse, Databricks) to deliver and optimise the enterprise data architecture.
- Experience with data integration technologies, real-time data ingestion, and API-based integrations.
- Experience with SQL and other database management systems.
- Strong problem-solving skills for interpreting complex data requirements and translating them into feasible data architecture solutions and models.
- Experience supporting the definition of an enterprise data vision and strategy, advising on implications and/or uplifts required to the enterprise data architecture.
- Experience designing and establishing data governance models and data management practices, ensuring data is correct and secure whilst still being accessible, in line with regulations and wider organisational policies.
- Able to present complex data-related initiatives and issues to senior, non-data-conversant audiences.
- Proven experience working with AI and machine learning models preferred, but not essential.

What We Can Offer You:
- We support your growth within the role, department, and across the company through internal opportunities.
- We offer a hybrid working model, allowing you to combine remote work with the opportunity to connect with your team in modern, welcoming office spaces.
- We encourage continuous learning with access to online platforms (e.g., LinkedIn Learning), language courses, soft skills training, and various wellbeing initiatives, including workshops and webinars.
- Join a diverse and inclusive work environment where your ideas are valued and your contributions make a difference.
Posted 1 month ago
10.0 - 15.0 years
30 - 40 Lacs
Pune, Bengaluru
Hybrid
Job Role & Responsibilities:
- Understand operational needs by collaborating with specialized teams, supporting key business operations. This involves architecting, designing, building, and deploying data systems, pipelines, etc.
- Design and implement agile, scalable, and cost-efficient solutions on cloud data services.
- Lead a team of developers; run sprint planning and execution to ensure timely deliveries.
- Design and implement data solutions using the medallion architecture, ensuring effective organization and flow of data through bronze, silver, and gold layers.
- Optimize data storage and processing strategies to enhance performance and data accessibility across the stages of the medallion architecture.
- Collaborate with data engineers and analysts to define data access patterns and establish efficient data pipelines.
- Develop and oversee data flow strategies to ensure seamless data movement and transformation across different environments and stages of the data lifecycle.
- Migrate data from traditional database systems to the cloud environment.

Technical Skills, Qualifications & Experience Required:
- 9-11 years of experience in cloud data engineering.
- Hands-on experience in Azure cloud data engineering: Azure Databricks, Data Factory, PySpark, SQL, Python.
- Proficient in Azure cloud services; able to architect and implement ETL and data movement solutions.
- Bachelor's/Master's degree in Computer Science or a related field.
- Strong hands-on experience working with streaming datasets.
- Experience building complex notebooks in Databricks to achieve business transformations.
- Hands-on expertise in data refinement using PySpark and Spark SQL.
- Familiarity with building datasets using Scala.
- Familiarity with tools such as Jira and GitHub.
- Experience leading agile scrum, sprint planning, and review sessions.
- Good communication and interpersonal skills.
- Immediate joiners preferred.
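The bronze/silver/gold flow the posting describes can be sketched end to end. This is a minimal illustration in plain Python rather than Databricks/PySpark, with made-up record fields; the layer responsibilities are the point, not the API:

```python
# Bronze layer: raw ingested records, kept as-landed (duplicates, bad rows).
bronze = [
    {"order_id": "1", "amount": "100.5", "country": "IN"},
    {"order_id": "1", "amount": "100.5", "country": "IN"},   # duplicate
    {"order_id": "2", "amount": None, "country": "IN"},      # bad record
    {"order_id": "3", "amount": "49.5", "country": "US"},
]

def to_silver(rows):
    """Silver layer: deduplicate on the business key, drop bad records,
    and cast raw strings to typed values."""
    seen, out = set(), []
    for r in rows:
        if r["amount"] is None or r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        out.append({**r, "amount": float(r["amount"])})
    return out

def to_gold(rows):
    """Gold layer: business-level aggregate (revenue per country)."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

gold = to_gold(to_silver(bronze))
```

In Databricks each layer would be a Delta table and the functions would be PySpark/Spark SQL transformations, but the contract per layer (raw, cleansed, business-ready) is the same.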
Posted 1 month ago
8.0 - 10.0 years
16 - 20 Lacs
Bengaluru
Work from Office
Role and Responsibilities:
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases to drive optimization and improvement of business strategies.
- Assess the effectiveness and accuracy of data sources and data gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modelling to increase and optimize business outcomes.
- Work individually or with extended teams to operationalize models and algorithms into structured software, programs, or operational processes.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Provide recommendations to business users based upon data/model outcomes, and implement recommendations, including changes in operational processes, technology, or data management.
- Primary area of focus: PSCM/VMI business; secondary area of focus: ICS KPIs.
- Business improvements pre and post (either operational program, algorithm, model, or resultant software), measured in time and/or dollar savings.
- Satisfaction score of business users (of either operational program, algorithm, model, or resultant software).

Qualifications and Education Requirements:
- Graduate BSc/BTech in applied sciences with year-2 statistics courses.
- Relevant internship (at least 2 months) OR relevant certifications in the preferred skills.

Preferred Skills:
- Strong problem-solving skills with an emphasis on business development.
- Experience with the following coding languages: R or Python (data cleaning, statistical, and modelling packages), SQL, VBA, and DAX (Power BI).
- Knowledge of working with and creating data architectures.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience with applications.
- Excellent written and verbal communication skills for coordinating across teams.
- A drive to learn and master new technologies and techniques.

Compliance Requirements:
- GET has a Business Ethics Policy which provides guidance to all employees in their day-to-day roles, as well as helping you and the business comply with the law at all times. The incumbent must read, understand, and comply with the policy at all times, along with all other corresponding policies, procedures, and directives.

QHSE Responsibilities:
- Demonstrate a personal commitment to Quality, Health, Safety, and the Environment.
- Apply GET's, and where appropriate the Client Company's, Quality, Health, Safety & Environment Policies and Safety Management Systems.
- Promote a culture of continuous improvement, and lead by example to ensure company goals are achieved and exceeded.

Skills:
- Analytical skills
- Negotiation
- Convincing skills

Key Competencies:
- Never-give-up attitude
- Flexible
- Eye for detail
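Of the machine learning techniques the posting lists, clustering is compact enough to sketch. A tiny pure-Python k-means on 1-D points (fixed initial centroids keep the example deterministic; real work would use scikit-learn or similar):

```python
def kmeans_1d(points, centroids, iters=10):
    """Plain k-means on 1-D points: assign each point to its nearest
    centroid, then move each centroid to the mean of its cluster."""
    clusters = [[] for _ in centroids]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Recompute centroids; keep the old one if a cluster emptied out.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

points = [1.0, 1.2, 0.8, 9.0, 9.5, 10.0]
centroids, clusters = kmeans_1d(points, centroids=[0.0, 5.0])
```

The classic real-world trade-off the posting alludes to: k-means is fast and simple but needs k up front and assumes roughly spherical, similar-sized clusters.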
Posted 1 month ago
8.0 - 10.0 years
16 - 20 Lacs
Surat
Work from Office
Posted 1 month ago
8.0 - 10.0 years
16 - 20 Lacs
Kolkata
Work from Office
IT & Technology Senior Manager Data Analytics
Experience: Minimum 8 years of experience
Posted 1 month ago
8.0 - 10.0 years
16 - 20 Lacs
Hyderabad
Work from Office
IT & Technology Senior Manager Data Analytics
Experience: Minimum 8 years of experience
Posted 1 month ago
9.0 - 12.0 years
50 - 55 Lacs
Pune
Work from Office
KPI Partners, a leader in providing analytics and data management solutions, is seeking a highly skilled **Senior Data Architect** to join our dynamic team. This position offers an exciting opportunity to work on innovative data solutions that drive business value for our clients. You will be responsible for designing, developing, and implementing data architectures that align with our organizational goals and client requirements.
Key Responsibilities:
- Lead the design and implementation of data architecture solutions, ensuring alignment with best practices and compliance standards.
- Develop comprehensive data models to support different business applications and analytical needs.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Oversee the integration of SAP and other data sources into the SAP Datasphere platform.
- Create strategies for data governance, quality assurance, and data lifecycle management.
- Ensure scalability and performance of data systems through efficient architecture practices.
- Mentor and guide junior data professionals in data architecture and modeling best practices.
- Stay updated on industry trends and emerging technologies in data architecture and analytics.
Required Skills:
- 9 to 12 years of experience in data architecture, data modeling, and data management.
- Strong expertise in SAP systems and data integration processes.
- Proficiency in SAP Datasphere or similar data management platforms.
- Solid understanding of data governance, data warehousing, and big data technologies.
- Excellent analytical and problem-solving abilities.
- Strong communication skills to engage with technical and non-technical stakeholders.
- Ability to work independently and as part of a team in a fast-paced environment.
Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Relevant certifications in data architecture or SAP technologies.
- Experience with cloud data solutions and platform migrations is a plus.
What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative work environment fostering innovation and creativity.
- The chance to work with cutting-edge technology and notable clients.
Posted 1 month ago
12.0 - 18.0 years
20 - 27 Lacs
Bengaluru
Work from Office
- Develop and deliver detailed technology solutions through consulting project activities.
- Evaluate and recommend emerging cloud data technologies and tools to drive innovation and competitive advantage, with a focus on Azure services such as Azure Data Lake, Azure Synapse Analytics, and Azure Databricks.
- Lead the design and implementation of cloud-based data architectures using Azure and Databricks to support the company's strategic initiatives and exploratory projects.
- Collaborate with cross-functional teams to understand business requirements, architect data solutions, and drive the development of innovative data platforms and analytics solutions.
- Define cloud data architecture standards, best practices, and guidelines to ensure scalability, reliability, and security across the organization.
- Design and implement data pipelines, ETL processes, and data integration solutions to ingest, transform, and load structured and unstructured data from multiple sources into Azure data platforms.
- Provide technical leadership and mentorship to junior team members, fostering a culture of collaboration, continuous learning, and innovation in cloud data technologies.
- Collaborate with Azure and Databricks experts within the organization and the broader community to stay abreast of the latest developments and best practices in cloud data architecture and analytics.
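The posting above asks for data pipelines that ingest, transform, and load data into Azure platforms. As a toy sketch of the extract-transform-load pattern only (the source data, field names, and per-region aggregation are invented for illustration; a real pipeline here would be built on Azure Data Factory, Synapse, or Databricks rather than plain Python):

```python
# A minimal extract-transform-load sketch of the pipeline pattern described
# above. Source and field names are hypothetical, not from the posting.
import csv
import io

RAW = """order_id,region,amount
1001,South,250.0
1002,North,
1003,South,120.5
"""

def extract(text):
    """Parse raw CSV text into row dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing amounts and cast numeric fields."""
    clean = []
    for row in rows:
        if row["amount"]:
            clean.append({**row, "amount": float(row["amount"])})
    return clean

def load(rows):
    """Aggregate into per-region totals, standing in for the warehouse write."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

print(load(transform(extract(RAW))))  # {'South': 370.5}
```

Keeping extract, transform, and load as separate stages is what lets each step be validated and rerun independently, which is the same reason managed pipeline services structure jobs this way.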
Posted 1 month ago
12.0 - 14.0 years
35 - 40 Lacs
Bengaluru
Work from Office
Position Summary Experienced Senior Data Engineer utilizing Big Data & Gogle Cloud technologies to develop large scale, on-cloud data processing pipelines and data warehouses. What you ll do Consult customers across the world on their data engineering needs around Adobes Customer Data Platform. Support pre-sales discsusions around complex and large scale cloud, data engineering solutions. Design custom solutions on cloud integrating Adobes solutions in scalable and performant manner. Deliver complex, large scale, enterprise grade on-clould data engineer and integration solutions in hand-on manner. Good to have Experience of consulting India customers. Multi-cloud expertise preferable AWS and GCP EXPERIENCE 12-14 Years SKILLS Primary Skill: Data Engineering Sub Skill(s): Data Engineering Additional Skill(s): Python, BigQuery
Posted 1 month ago
9.0 - 12.0 years
14 - 18 Lacs
Bengaluru
Work from Office
Experience: 9 - 12 Years
Location: Bangalore / Hyderabad
Notice Period: Immediate to 15 Days
Overview
We are looking for a highly experienced and strategic Snowflake Data Architect to lead the transformation and modernization of our data architecture. You will be responsible for designing scalable, high-performance data solutions and ensuring seamless data quality and integration across the organization. This role requires close collaboration with data modelers, business stakeholders, governance teams, and engineers to develop robust and efficient data architectures. This is an excellent opportunity to join a dynamic, innovation-driven environment with significant room for professional growth. We encourage initiative, creative problem-solving, and a proactive approach to optimizing our data ecosystem.
Responsibilities
- Architect, design, and implement scalable data solutions using Snowflake.
- Build and maintain efficient data pipelines using SQL and ETL tools to integrate data from multiple ERP and other source systems.
- Leverage data mappings, modeling (2NF/3NF), and best practices to ensure consistent and accurate data structures.
- Collaborate with stakeholders to gather requirements and design data models that support business needs.
- Optimize and debug complex SQL queries and ensure performance tuning of pipelines.
- Create secure, reusable, and maintainable components for data ingestion and transformation workflows.
- Implement and maintain data quality frameworks, ensuring adherence to governance standards.
- Lead User Acceptance Testing (UAT) support and production deployment activities, and manage change requests.
- Produce comprehensive technical documentation for future reference and auditing purposes.
- Provide technical leadership in the use of cloud platforms (Snowflake, AWS) and support teams through knowledge transfer.
Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 9 to 12 years of overall experience in data engineering and architecture roles.
- Strong, hands-on expertise in Snowflake with a solid understanding of its advanced features.
- Proficiency in advanced SQL with extensive experience in data transformation and pipeline optimization.
- Deep understanding of data modeling techniques, especially 2NF/3NF normalization.
- Experience with cloud-native platforms, especially AWS (S3, Glue, Lambda, Step Functions), is highly desirable.
- Knowledge of ETL tools (Informatica, Talend, etc.) and experience working in agile environments.
- Familiarity with structured deployment workflows (e.g., Carrier CAB process).
- Strong debugging, troubleshooting, and analytical skills.
- Excellent communication and stakeholder management skills.
Key Skills: Snowflake (Advanced), SQL (Expert), Data Modeling (2NF/3NF), ETL Tools, AWS (S3, Glue, Lambda, Step Functions), Agile Development, Data Quality & Governance, Performance Optimization, Technical Documentation, Stakeholder Collaboration
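The responsibilities above include implementing data quality frameworks under governance standards. A toy version of the two most common checks such a framework formalizes (null checks and primary-key uniqueness) looks like this; the field names and sample batch are illustrative only, and a real Snowflake deployment would express these as SQL checks or use a dedicated framework:

```python
# Toy data-quality checks of the kind the role above would formalize:
# null checks and key-uniqueness checks on a batch of records.
def check_not_null(rows, field):
    """Return indices of rows where `field` is missing or empty."""
    return [i for i, r in enumerate(rows) if not r.get(field)]

def check_unique(rows, field):
    """Return values of `field` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(field)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

# Hypothetical batch with one missing email and one duplicate key.
batch = [
    {"customer_id": "C1", "email": "a@x.com"},
    {"customer_id": "C2", "email": ""},
    {"customer_id": "C1", "email": "c@x.com"},
]

print(check_not_null(batch, "email"))      # [1]
print(check_unique(batch, "customer_id"))  # ['C1']
```

Running such checks at ingestion time, and failing or quarantining bad batches before they reach the warehouse, is the core idea behind the "data quality framework" responsibility.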
Posted 1 month ago
8.0 - 10.0 years
16 - 20 Lacs
Jaipur
Work from Office
Posted 1 month ago
8.0 - 10.0 years
16 - 20 Lacs
Ahmedabad
Work from Office
Posted 1 month ago
8.0 - 10.0 years
16 - 20 Lacs
Mumbai
Work from Office
Posted 1 month ago
9.0 - 14.0 years
8 - 18 Lacs
Bhopal, Hyderabad, Pune
Hybrid
Urgent Opening for Data Architect Position
Experience: Minimum 9 years
Salary: As per industry standards
Notice Period: Immediate joiners are preferred
Skills: Databricks, ADF, Python, Synapse
Job Description
Work you'll do
This role is responsible for data architecture and support in a fast-paced, cross-cultural, diverse environment leveraging the Agile methodology. It requires a solid understanding of translating business requirements into data models and the underlying data structures that support a data architecture's design. The person who fills this role is expected to work closely with product owners and a cross-functional team comprising business analysts, software engineers, functional and non-functional testers, operations engineers, and project management.
Key Responsibilities
- Collaborate with product managers, designers, and fellow developers to design, develop, and maintain web-based applications and software solutions.
- Write clean, maintainable, and efficient code, adhering to coding standards and best practices.
- Perform code reviews to ensure code quality and consistency.
- Troubleshoot and debug software defects and issues, providing timely solutions.
- Participate in the entire software development lifecycle, from requirements gathering to deployment and support.
- Stay up to date with industry trends and emerging technologies, incorporating them into the development process when appropriate.
- Mentor junior developers and provide technical guidance when needed.
Qualifications
Technical Skills:
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Strong experience in data engineering; well versed in data architecture.
- 10+ years of professional experience in data engineering and architecture.
- Advanced understanding of data modelling and design.
- Strong database management and design skills; SQL Server preferred.
- Strong understanding of Databricks and Azure Synapse.
- Understanding of data pipeline/ETL frameworks and libraries.
- Experience with Azure Cloud Components (PaaS) and DevOps is required.
- Experience working in Agile and SAFe development processes.
- Excellent problem-solving and analytical skills.
Other Skills
- Strong organizational and communication skills.
- Flexibility, energy, and the ability to work well with others in a team environment.
- The ability to effectively manage multiple assignments and responsibilities in a fast-paced environment.
- Expert problem solver: finding simple answers to complex questions or problems.
- Able to learn and upskill on new technologies.
- Drive for results: partner with product owners to deliver on short- and long-term milestones.
- Experience working with product owners and development teams to document and clarify business and user requirements and to manage the scope of defined features and functions during the project lifecycle.
- Critical thinking: able to think outside the box and use knowledge gained through prior experience, education, and training to resolve issues and remove project barriers.
- Strong written and verbal communication skills, with the ability to present to and collaborate with business leaders.
- Experience interfacing with external software design and development vendors preferred.
- A team player who can deliver in a high-pressure, high-demand environment.
If interested, kindly share your resume at vidya.raskar@newvision-software.com
Regards,
Vidya
Posted 1 month ago
10.0 - 20.0 years
10 - 20 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role & responsibilities: Data Architect (Snowflake). Database Architect/Data Modeler with 15+ years of experience in AWS and Snowflake. Responsible for the overall architecture and delivery of the solution, including risk mitigation, issue resolution, and coordination with internal and external teams.
Posted 1 month ago
5.0 - 11.0 years
7 - 13 Lacs
Bengaluru
Work from Office
Hiring for Data Architect - Bangalore
Designs and manages an organization's data infrastructure, ensuring data accuracy, accessibility, and security for strategic decision-making. Data Architects collaborate with business stakeholders, understand data requirements, and translate them into technical specifications for data models, data warehouses, and BI solutions. The role involves designing ETL processes, data visualization, and implementing data governance policies.
Posted 1 month ago