7 - 12 years
20 - 35 Lacs
Chennai, Hyderabad
Work from Office
Data Architect / Solution Architect
Location: Chennai, Hyderabad
Experience: 7+ years
- Experience maintaining data warehouses (e.g., Redshift, Teradata)
- Data warehouse development experience using ETL tools (e.g., SAS DI, Glue, DBT)
Contact: sneha.parashar@biitservices.com
Posted 3 months ago
16 - 18 years
27 - 37 Lacs
Bengaluru
Hybrid
Designation / Business Line: Advisory - Digital - DA - Enterprise Delivery Architect

Job Summary: This position will work with cutting-edge technology, deliver high-quality solutions across various industries, and oversee teams on engagements that range in size and scope. It offers continuous career development opportunities, given the size and potential of client engagements. The role supports the realization of a business's strategic vision by understanding the current state of technology, performing gap analyses, and developing plans to close the gaps, and will ultimately own consulting relationships across multiple clients and technologies. The Enterprise Architect is an entrepreneurial-minded individual who works with our clients to define and implement their long-term information architectures and strategies. A successful candidate believes nothing is impossible, has broad experience with systems, platforms, and integrations, can establish a clear vision of our clients' needs and put a plan into action, is a self-starter, and views challenges and obstacles as opportunities for growth and reward. This position will collaborate with the Data Analytics Practice at BDO Digital (USA), an IT advisory and technology services consulting organization.

Core Roles and Responsibilities:
- Enterprise Architecture: Leads, architects, defines, and implements best-in-class technical architecture for data ingestion strategies, data warehouse and data mart structures, and semantic layers and models, primarily (but not limited to) on cloud analytics platforms.
- Visualization: Leads, designs, defines, and implements best-in-class business intelligence strategies and methodologies.
- Implementation and Support: Supports creation of written functional, architectural, and technical designs; responsible for overall quality and accurate implementation of deliverables from the technical team.
Consulting: Manages and oversees client needs, analyzes best practices, and plays an integral part in defining and proposing business-value solutions to meet or exceed client expectations. Other duties as required.

Supervisory Responsibilities:
- Development Team Lead: Responsible for design and technical oversight of a team of developers.
- Mentoring and Coaching: Ensures the team is properly supported on projects and provides direct support as needed.
- Project Oversight: Hands-on management of project teams; ensures timelines and deliverables are met; oversees time-reporting accuracy and the technical management of projects; performs delegation and code reviews and ensures adherence to standards; responsible for project delivery and for the overall quality and accurate implementation of deliverables from the technical team.

Technical Requirements:
Education: Bachelor's degree from an accredited university, required; degree with a focus in Computer Science, preferred.
Experience:
- Ten (10) or more years of data engineering experience, required
- Five (5) or more years of data visualization experience (DAX programming, APIs, data models, SQL, storytelling, and wireframe design), required
- Four (4) or more years technically leading development projects, required
- Four (4) or more years producing technical designs (artifacts), preferred
- Professional experience with PySpark, Snowflake, SQL, SQL Server, and Power BI, required
- Knowledge of data lake/data warehousing/big data tools, Apache Spark, RDBMS and NoSQL, and knowledge graphs
- Strong experience with Azure data services such as Azure Data Factory, Azure Synapse, Azure Databricks, Azure SQL Database, Azure Cosmos DB, Azure Blob Storage/Data Lake Storage, and Azure Stream Analytics
- Good understanding of the ETL cycle
Core Skillset Required: PySpark, SQL, SQL Server, Azure Synapse, Azure Fabric, Python
Posted 3 months ago
10 - 15 years
32 - 37 Lacs
Trivandrum
Work from Office
Responsibilities:
- Design and implement robust data architectures on Azure, Google Cloud, or on-premises platforms like Cloudera.
- Develop and optimize data pipelines, architectures, and data sets for real-time data integration.
- Ensure data architecture aligns with business requirements and supports the creation of data products.
- Apply expertise in SQL and data modelling to manage and manipulate large, complex data sets.
- Utilize data virtualization tools to provide efficient access to various data sources.
- Implement Master Data Management (MDM) and Data Quality frameworks to maintain high data integrity.
- Collaborate with cross-functional teams to integrate industry-standard data sources and systems such as PI, SAP, LIMS, SPF, Maximo, SharePoint, SCADA, and HYSYS.
- Uphold data governance and data integrity principles specific to the Oil and Gas/Power and Utility/Manufacturing industries.
- Stay current with emerging technologies and industry trends to drive innovation within the team.

Qualifications:
- 10+ years of experience in data architecture or a related field.
- Proven experience with Databricks and cloud technologies (Azure/Google Cloud).
- Familiarity with on-premises data solutions like Cloudera.
- Extensive experience in the Oil and Gas/Power and Utility/Manufacturing industries.
- Strong proficiency in SQL and experience with real-time data integration.
- Solid understanding of data products, data fabric, and data virtualization tools.
- Knowledge of MDM and Data Quality principles.
- Proficiency with industry-standard data sources and systems (PI, SAP, LIMS, SPF, Maximo, SharePoint, SCADA, HYSYS).
- Excellent knowledge of data modelling, data governance, and data integrity principles.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Posted 3 months ago
6 - 10 years
0 - 2 Lacs
Pune
Hybrid
Data Architect will be responsible for designing, building, and optimizing data systems to ensure efficient data storage, integration, and accessibility.
Posted 3 months ago
9 - 12 years
13 - 17 Lacs
Pune
Work from Office
As a Cloud Data Architect you will:
- Work with all levels of staff, management, stakeholders, and vendors.
- Display advanced analytical and conceptual skills to create original concepts for various projects.
- Design and develop platforms for scalable solutions.
- Provide forward-thinking technical expertise in current and emerging technologies, trends, and practices.
- Anticipate internal and/or external business challenges, including regulations, and recommend process, product, or service enhancements.
- Design, prototype, and develop big data platforms utilizing Azure or AWS.
- Translate business goals and objectives into architectural strategies and technical solutions.
- Serve in a lead capacity on client initiatives.
- Mentor junior Data Architects and Data Engineers on Clarkston's analytics team.
- Assist in defining and managing client solutions.

How You'll Grow: Beyond your day-to-day responsibilities, throughout your career at Clarkston you will:
- Receive the support and mentorship of your Clarkston colleagues and leaders.
- Expand your existing skillset with internal and external professional development opportunities.

What We're Looking For:
- 9+ years of experience with data architecture, data engineering, and data modeling, specifically in support of data warehousing, business intelligence, master data management, and reference data management.
- Experience with data management solutions using data lake platform tools and technologies.
- Experience architecting modern data platforms that leverage the data lake platform, data engineering frameworks, and Databricks, preferably in an Azure cloud environment.
- A strong data modeling and data engineering background is a must, along with SQL, PySpark, and Python proficiency.
- Hands-on experience designing large data products leveraging Azure Databricks and Azure Data Factory (ADF).
- Familiarity with delivering Data as a Service (DaaS) from enterprise repositories using REST APIs.
- Familiarity with Azure or AWS-based data and data management tools, such as Azure SQL, Azure Data Catalog, Azure Service Fabric, and microservices.
- Familiarity with data integration points for TOGAF (or another recognized architecture framework).
- Working knowledge of development or operations framework methodologies, including the Information Technology Infrastructure Library (ITIL), the Systems Development Life Cycle (SDLC), etc.
- Familiarity with developing data catalogues and integrating with existing enterprise architecture.
- Familiarity with cloud data platforms and process automation.
- Experience using various data modeling techniques and their appropriate usage in data integration.
- Experience working with all levels of staff, management, stakeholders, and vendors.
- Significant experience performing analysis of business processes and functions.
- Significant experience working as an engineer or technical subject matter expert in two or more of the following areas: infrastructure, data, application, and/or security-related technology.
- Effective skill creating system diagrams for review by both technical and non-technical audiences.
- Advanced analytical and conceptual skills to create original concepts and theories for various projects.
- Ability to develop and apply architectural governance based on business and information technology strategies.
- Effective skill applying innovative approaches to solve technical design issues.
- Ability to anticipate internal and/or external business challenges, including regulatory ones, and recommend process, product, or service enhancements.
- Forward-thinking technical expertise in current and emerging technologies, trends, and practices.
- 4-year degree in Business Administration, Computer Science, Information Systems, Engineering, or an equivalent discipline.
Posted 3 months ago
10 - 14 years
12 - 17 Lacs
Hyderabad
Work from Office
Designation: Associate Data Architect
Experience: 10 to 14 years
Work Location: Hyderabad, Telangana
Qualification: Bachelor's degree in Computer Science, Information Systems, or a related field

Brief description of the role: The Data Architect will be responsible for solutioning and designing the processes and systems used to move data from various sources into a data warehouse or other target systems using Informatica platform technologies. The candidate should be able to design and develop the architecture for a robust ETL/ELT platform using on-premises or cloud-based solutions, and should be able to mentor, coach, and lead a team of developers and leads in implementing such solutions.

Key Roles & Responsibilities:
- Develop and communicate the organization's data strategy, aligning it with business goals and objectives.
- Define data architecture principles, standards, and best practices to guide data management across the organization.
- Collaborate with business stakeholders, analysts, and IT teams to understand data requirements and drive data initiatives.
- Design and create logical and physical data models that capture the structure and relationships of data entities.
- Ensure data models adhere to data normalization, denormalization, and indexing principles for optimal performance.
- Select appropriate database technologies and solutions based on data requirements, volume, and access patterns.
- Design and create database schemas, tables, views, and indexes for efficient data storage and retrieval.
- Define data integration strategies and methods for bringing together data from various sources, both internal and external.
- Design and implement ETL (Extract, Transform, Load) processes to move and transform data between systems.
- Establish data governance practices, including data ownership, data lineage, and data access controls.
- Implement data quality initiatives to ensure accurate and reliable data across systems.
- Define and enforce data security policies, access controls, and encryption measures to protect sensitive data.
- Ensure compliance with data protection regulations and industry standards.
- Monitor and optimize database performance, including query optimization, indexing, and caching strategies.
- Identify and address bottlenecks and performance issues in the data architecture.
- Evaluate and integrate big data technologies and cloud services to accommodate the organization's growing data needs.
- Design data pipelines and architectures for processing and analyzing large-scale data.
- Collaborate with cross-functional teams, including business analysts, data engineers, developers, and business stakeholders.
- Communicate complex technical concepts to non-technical stakeholders in a clear and understandable manner.
- Stay up to date with emerging data technologies, trends, and best practices.
- Identify opportunities to innovate and leverage new data tools and techniques to enhance the data architecture.
- Maintain detailed documentation of data architecture, data flows, data models, and processes.
- Develop and implement an MDM strategy in collaboration with stakeholders.

Skills:
- 7+ years of experience in ETL, data warehousing, or a related field using Informatica's suite of products and solutions.
- 5+ years of experience working with AWS technologies.
- Hands-on experience using scheduling tools like Apache Airflow.
- Hands-on experience using CI/CD tools like GitHub, AWS CodeCommit, Azure DevOps, etc.
- Hands-on experience building data solutions using AWS technologies.
- Strong understanding of data governance, data quality, and data management best practices.
- Good experience in Unix shell scripting and Python.
- In-depth understanding of data warehousing concepts and their applications.
- In-depth understanding of data lakes, Delta Lake, and lakehouse implementations.
- In-depth understanding of CDC, SCDs, and their implementations.
- In-depth expertise in handling structured, semi-structured, and unstructured files.
- Experience working with cloud data warehouses such as Snowflake, Redshift, and Databricks.
- Strong data analysis skills.
- Proficiency in writing complex, efficient SQL queries and tuning them for performance when required.
- In-depth understanding of data integration tools and technologies such as Informatica PowerCenter, Informatica Data Quality, or IDMC.
- Good hands-on experience with Python is preferred.
- Excellent communication, collaboration, and leadership skills.
- Experience with data analytics and data visualization tools, such as Tableau, Power BI, or QlikView, is a plus.
- Experience with data privacy and security regulations, such as GDPR or CCPA, is a plus.
- TOGAF certification is an added advantage.
- Hands-on expertise with Informatica tools and utilities such as Data Controls, Data Director, and Data Quality, preferred.
- Strong understanding of iterative development methodologies (Agile, Scrum).
Posted 3 months ago
15 - 19 years
40 - 50 Lacs
Hyderabad
Work from Office
Role: The ideal professional for this Cloud Architect role will:
- Have a passion for design, technology, analysis, collaboration, agility, and planning, along with a drive for continuous improvement and innovation.
- Exhibit expertise in managing high-volume data projects that leverage cloud platforms, data warehouse reporting and BI tools, and the development of relational databases.
- Research, identify, and internally market enabling data management technologies based on business and end-user requirements.
- Seek ways to apply new technology to business processes, with a focus on modernizing the approach to data management.
- Consult with technical subject matter experts and develop alternative technical solutions; advise on options, risks, costs versus benefits, and impact on other business processes and system priorities.
- Demonstrate strong technical leadership skills and the ability to mentor others in related technologies.

Qualifications:
- Bachelor's degree in a computer-related field or equivalent professional experience is required; a Master's degree in Computer Science, Information Systems, or a related discipline, or equivalent and extensive related project experience, is preferred.
- 10+ years of hands-on software development experience building data platforms with tools and technologies such as Hadoop, Cloudera, Spark, Kafka, relational SQL, NoSQL databases, and data pipeline/workflow management tools.
- 6+ years of experience working with various cloud platforms (at least two from among AWS, Azure, and GCP).
- Experience in multi-cloud data platform migration, with hands-on experience working with AWS, Azure, and GCP.
- Experience in Data & Analytics projects is a must.
- Data modeling experience, relational and dimensional, with consumption requirements (reporting, dashboarding, and analytics).
- Thorough understanding and application of AWS services related to cloud data platform and data lake implementation: S3 data lake, AWS EMR, AWS Glue, Amazon Redshift, AWS Lambda, and Step Functions, with file formats such as Parquet, Avro, and Iceberg.
- Must know the key tenets of architecting and designing solutions on the AWS and Azure clouds.
- Expertise and implementation experience in data-specific areas such as AWS data lakes, data lakehouse architecture, Azure Synapse, and SQL data warehouses.
- Ability to apply technical knowledge to architect and design solutions that meet business and IT needs, create Data & Analytics roadmaps, drive POCs and MVPs, and ensure the long-term technical viability of new deployments, infusing key Data & Analytics technologies where applicable.
- Acts as the Voice of the Customer to share insights and best practices, connects with the engineering team to remove key blockers, and drives migration solutions and implementations.
- Familiarity with tools like DBT, Airflow, and data test automation.
- Must have experience with Python/PySpark/Scala in big data environments.
- Strong skills in SQL queries in big data tools such as Hive, Impala, and Presto.
- Experience working with and extracting value from large, disconnected, and/or unstructured datasets.
- Demonstrated ability to build processes that support data transformation, data structures, metadata, dependency, and workload management.
- Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
- Experience building and optimizing big data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Posted 3 months ago
8 - 15 years
14 - 18 Lacs
Bengaluru
Work from Office
Lead Data Architects lead the design and implementation of data collection, storage, transformation, orchestration (movement), and consumption to achieve optimum value from data. They are the technical leaders within data delivery teams. They play a key role in modelling data for optimal reuse, interoperability, security, and accessibility, as well as in the design of efficient ingestion and transformation pipelines. They ensure data accessibility through a performant, cost-effective consumption layer that supports use by citizen developers, data scientists, AI, and application integration, and they instill trust through the employment of data quality frameworks and tools. The data architect at Chevron predominantly works within the Azure Data Analytics Platform but is not limited to it. The Senior Data Architect is responsible for optimizing the cost of delivering data, for ensuring compliance with enterprise standards, and for contributing to the evolution of those standards as technologies and best practices change.

Key Responsibilities:
- Design and oversee the entire data architecture strategy.
- Mentor junior data architects to ensure skill development in alignment with the team strategy.
- Design and implement complex, scalable, high-performance data architectures that meet business requirements.
- Model data for optimal reuse, interoperability, security, and accessibility.
- Develop and maintain data flow diagrams and data dictionaries.
- Collaborate with stakeholders to understand data needs and translate them into technical solutions.
- Ensure data accessibility through a performant, cost-effective consumption layer that supports use by citizen developers, data scientists, AI, and application integration.
- Ensure data quality, integrity, and security across all data systems.

Qualifications:
- Experience in Erwin, Azure Synapse, Azure Databricks, Azure DevOps, SQL, Power BI, Spark, Python, and R.
- Ability to drive business results by building optimal-cost data landscapes.
- Familiarity with Azure AI/ML services; Azure analytics (Event Hub, Azure Stream Analytics); scripting (Ansible).
- Experience with machine learning and advanced analytics.
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Understanding of CI/CD pipelines and automated testing frameworks.
- Certifications such as AWS Certified Solutions Architect, IBM Certified Data Architect, or similar are a plus.
Key skills: Azure Synapse, SQL, Spark, Python, Power BI, Azure DevOps, Azure Databricks
Posted 3 months ago
10 - 20 years
15 - 30 Lacs
Pune
Hybrid
About GSPANN: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations with our experience in retail, high technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm. GSPANN is looking for a Data Architect. As we march ahead on a tremendous growth trajectory, we seek passionate and talented professionals to join our growing family.

Job Position: Data Architect
Experience Required: 8+ years
Location: Gurugram / Hyderabad / Pune
Technical Skills: Data Architect, ETL, Data Engineer, Python, SQL, and any cloud

Role and Responsibilities:
- Develop architectural strategies for data modeling, design, and implementation to meet stated requirements for metadata management, master data management, data warehouses, ETL, and ELT.
- Analyze business requirements, design scalable and robust data models, document conceptual, logical, and physical data model designs, help developers create DB structures, and support developers throughout the project life cycle.
- Lead and mentor data engineers: this role is responsible for leading and developing a team of data engineers, focused on growing the team's skills and its ability to execute as a team using DevOps and DataOps principles.
- Investigate new technologies, data modelling methods, and information management systems to determine which should be incorporated into data architectures, and develop implementation timelines and milestones.
- Recognize and resolve conflicts between models, ensuring that information and data models are consistent with the ecosystem model (e.g., entity names, relationships, and definitions).
- Participate in the design of the information architecture: support projects; review information elements including models, glossary, flows, and data usage.
- Provide guidance to the team in achieving project goals and milestones.
- Work independently within broad guidelines and policies, with guidance only in the most complex situations.
- Contribute as an expert to multiple delivery teams: define best practices, build reusable designs and components, build capability, align with industry trends, and actively engage with wider data communities.

Required Skills:
- Graduate or postgraduate degree in Computer Science, Electronics, or Software Engineering.
- 6+ years of relevant experience in data modelling for DW & analytics applications (OLAP) and database-related technologies.
- Expert data modelling skills (conceptual, logical, and physical model design; experience with enterprise data warehouses and data marts).
- Solid understanding of cloud database technologies and services (e.g., GCP, Redshift, Aurora, DynamoDB, etc.).
- Experience working with data governance, data quality, and data security teams.
- Experience with knowledge-driven data processing techniques such as data curation, representation, standardization, normalization, and any other processing that prepares data for integration, persistence, analysis, and exchange/sharing.
- Experience handling very large databases and large data volumes.
- Strong experience working with large, heterogeneous datasets to build and optimize data pipelines, pipeline architectures, and integrated datasets using traditional data integration technologies, including ETL/ELT, data replication/CDC, message-oriented data movement, and emerging ingestion and integration technologies such as stream data integration and data virtualization.
- Strong experience working with and optimizing existing ETL processes and data integration and data preparation flows, and helping move them into production.
- Knowledge of popular data discovery, analytics, and BI software tools such as MicroStrategy, Tableau, and Power BI for semantic-layer-based data discovery.
- Ability to lead and mentor teams for effective delivery.
- Crisp and effective executive communication skills, including significant experience presenting cross-functionally and across all levels.
Posted 3 months ago
12 - 15 years
14 - 18 Lacs
Chennai, Pune, Delhi
Work from Office
Data Architect / Developer Role
- 12-15 years of total experience
- At least 10 years of experience with any ETL tool (e.g., Oracle)
- At least 2 years of experience with Ab Initio Data Quality, Catalog, and Governance
- Excellent SQL scripting and PL/SQL skills
- Excellent verbal and written communication
- Experience managing business stakeholders
Good to have:
- Hands-on experience with AWS services like Glue, RDS, Redshift, S3, and Athena
- Experience in dimensional modelling
- Knowledge of CI/CD implementation using AWS-native services
Posted 3 months ago
8 - 10 years
27 - 32 Lacs
Bengaluru
Work from Office
Job Title: Senior Data Modeler | Experience: 8-10+ years | Domain: Healthcare | Job Location: Onsite (Hyderabad / Bangalore / Pune / Chandigarh) | Per-day rate: INR 12,000 - INR 14,000 per day (max)

Skills & Qualifications:
- Experience: 8-10+ years in data modeling with a strong focus on healthcare data.
- Proven experience in data profiling and data quality assessments.
- Strong conceptual data modeling skills, with the ability to translate business requirements into well-structured data models.
- Experience with any data modeling tool (e.g., Erwin, IBM InfoSphere Data Architect, Oracle Data Modeler, etc.), as long as the candidate can demonstrate the ability to apply logical data design.
- Expertise in healthcare data, including patient data, claims data, clinical data, or other healthcare-specific datasets.
- Strong knowledge of relational and dimensional data modeling.
- Experience working with data integration tools, ETL processes, and databases.
- Understanding of data governance, data security, and compliance, especially in the healthcare domain.
- Excellent communication skills for liaising with business and technical teams.
Posted 3 months ago
8 - 13 years
15 - 20 Lacs
Mumbai
Work from Office
Requirements:
- Proven experience (X years) in designing and implementing cloud-based data architectures in a large-scale environment.
- Expertise in cloud platforms such as AWS (preferred), Azure, or Google Cloud, including services like S3, Redshift, BigQuery, etc.
- Strong proficiency in data modelling, ETL development, and optimizing SQL/NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB).
- Familiarity with big data technologies and frameworks (e.g., Hadoop, Spark) for processing large datasets.
- Deep understanding of data governance, security, and compliance requirements in cloud environments.
- Ability to effectively communicate complex technical concepts to non-technical stakeholders and collaborate within cross-functional teams.
- Demonstrated problem-solving skills and attention to detail in architecting robust data solutions that meet scalability, performance, and reliability objectives.

Responsibilities:
- Designing and implementing scalable, secure, and efficient cloud-based data architectures, leveraging platforms such as AWS, Azure, or Google Cloud.
- Developing comprehensive strategies for data acquisition, storage, and transformation from heterogeneous sources (e.g., structured, semi-structured, unstructured data).
- Ensuring data solutions align with business objectives and comply with industry standards and best practices.
- Collaborating closely with business stakeholders, data engineers, and software developers to integrate data solutions seamlessly across various systems.
- Optimizing data pipelines and workflows to maximize performance, reliability, and cost-efficiency.
- Conducting thorough data assessments, identifying bottlenecks, and providing actionable recommendations for improvement.
Posted 3 months ago