
1808 Data Architecture Jobs - Page 50

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

2.0 - 7.0 years

25 - 40 Lacs

Pune

Work from Office

Job Title: Technical Lead Python Developer
Location: Kalyani Nagar, Pune
Shift Timing: 3:00 PM to 12:00 AM IST

About the Role: We are looking for a highly skilled and passionate Python Technical Lead to join our growing team at AM Infoweb. This role is purely backend and hands-on coding focused, and is ideal for someone who is not only technically strong but also has the capability to design, architect, and lead backend projects. As a Technical Lead, you'll take ownership of the backend architecture, manage complex data-driven systems, and guide the team through technical challenges using best-in-class tools and practices.

Key Responsibilities:
- Lead backend development efforts using Python and associated frameworks.
- Design, develop, and maintain scalable backend architecture for web and AI-based applications.
- Architect solutions around given datasets and business logic.
- Guide junior developers, conduct code reviews, and ensure best coding practices.
- Collaborate with cross-functional teams including AI engineers, frontend developers, DevOps, and product owners.
- Work with CI/CD pipelines to ensure rapid and stable deployment cycles.
- Integrate and manage databases such as MySQL, PostgreSQL, and MongoDB.
- Deploy, monitor, and maintain services on AWS and Azure cloud platforms.

Technical Requirements:
- Strong experience with Python backend frameworks: Django, Flask, FastAPI (at least two).
- Experience with relational and non-relational databases: MySQL, PostgreSQL, MongoDB.
- Solid understanding of API development, RESTful services, and system architecture.
- Experience working with cloud platforms: AWS, Azure.
- Familiarity with CI/CD tools and DevOps practices.
- Strong problem-solving and communication skills.

Preferred Qualifications:
- 6+ years of experience in backend development.
- 1-2 years in a technical leadership role.
- Exposure to AI and machine learning integrations is a plus.
- Experience with microservices and containerization (Docker, Kubernetes) is an advantage.
What You Get:
- International exposure to global projects and clients.
- Work with trending AI tools and intelligent bots.
- Onsite cafeteria and break facilities.
- Fun and engaging team events and celebrations.

* Know your organization - https://www.aminfoweb.in/
* Know your workspace! - https://www.youtube.com/watch?v=T1UKFelepCk
* Our Annual RNR'2022 - https://www.youtube.com/watch?v=T1UKFelepCk
* Exploring the myths surrounding outsourced healthcare management - https://www.youtube.com/watch?v=fwf3jFa2T-A
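The posting's emphasis on REST-style backend services can be illustrated with a minimal, framework-free sketch; the `/health` endpoint, handler name, and port handling here are hypothetical stand-ins for what a Django/Flask/FastAPI route would provide:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ApiHandler(BaseHTTPRequestHandler):
    """Hypothetical health-check endpoint, illustrative only."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

def serve_once() -> dict:
    """Bind to an ephemeral port, hit /health once, and return the JSON body."""
    server = HTTPServer(("127.0.0.1", 0), ApiHandler)
    port = server.server_address[1]
    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()
    try:
        with urllib.request.urlopen(f"http://127.0.0.1:{port}/health") as resp:
            return json.loads(resp.read())
    finally:
        server.shutdown()
```

A real backend would layer routing, validation, and persistence on top; the point is only the request/response shape.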

Posted 2 months ago

Apply

15.0 - 20.0 years

20 - 25 Lacs

Hyderabad

Work from Office

Data Engineer

Purpose: Over 15 years, we have become a premier global provider of multi-cloud management, cloud-native application development solutions, and strategic end-to-end digital transformation services. Headquartered in Canada and with regional headquarters in the U.S. and the United Kingdom, Centrilogic delivers smart, streamlined solutions to clients worldwide.

We are looking for a passionate and experienced Data Engineer to work with our other 70 Software, Data, and DevOps engineers to guide and assist our clients' data modernization journeys. Our team works with companies with ambitious missions - clients who are creating new, innovative products, often in uncharted markets. We work as embedded members and leaders of our clients' development and data teams. We bring experienced senior engineers, leading-edge technologies and mindsets, and creative thinking. We show our clients how to move to modern frameworks of data infrastructure and processing, and we help them reach their full potential with the power of data. In this role, you'll be the day-to-day primary point of contact with our clients to modernize their data infrastructures, architecture, and pipelines.

Principal Responsibilities:
- Consulting clients on cloud-first strategies for core bet-the-company data initiatives
- Providing thought leadership on both process and technical matters
- Becoming a real champion and trusted advisor to our clients on all facets of Data Engineering
- Designing, developing, deploying, and supporting the modernization and transformation of our clients' end-to-end data strategy, including infrastructure, collection, transmission, processing, and analytics
- Mentoring and educating clients' teams to keep them up to speed with the latest approaches, tools, and skills, and setting them up for continued success post-delivery

Required Experience and Skills:
- Must have either the Microsoft Certified Azure Data Engineer Associate or Fabric Data Engineer Associate certification.
- Must have experience working in a consulting or contracting capacity on large data management and modernization programs.
- Experience with SQL Server and data engineering on platforms such as Azure Data Factory, Databricks, Data Lake, and Synapse.
- Strong knowledge and demonstrated experience with Delta Lake and Lakehouse Architecture.
- Strong knowledge of securing Azure environments, e.g. RBAC, Key Vault, and Azure Security Center.
- Strong knowledge of Kafka and Spark and extensive experience using them in a production environment.
- Strong and demonstrable experience as a DBA in large-scale MS SQL environments deployed in Azure.
- Strong problem-solving skills, with the ability to get to the root of an issue quickly.
- Strong knowledge of Scala or Python.
- Strong knowledge of Linux administration and networking.
- Scripting skills and Infrastructure as Code (IaC) experience using PowerShell, Bash, and ARM templates.
- Understanding of security and corporate governance issues related to cloud-first data architecture, as well as accepted industry solutions.
- Experience in enabling continuous delivery for development teams using scripted cloud provisioning and automated tooling.
- Experience working with an Agile development methodology that is fit for purpose.
- Sound business judgment and demonstrated leadership
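The Delta Lake / lakehouse work described above typically moves data through bronze (raw) and silver (cleansed) layers. As a sketch of a silver-layer cleanse in plain Python - in practice this would be a Spark/Delta job, and the record shape here is invented:

```python
from datetime import datetime

# Hypothetical bronze records: raw events with a duplicate and string-typed columns.
bronze = [
    {"id": 1, "ts": "2024-01-01T00:00:00", "amount": "10.5"},
    {"id": 1, "ts": "2024-01-01T00:00:00", "amount": "10.5"},  # duplicate row
    {"id": 2, "ts": "2024-01-02T12:30:00", "amount": "7.0"},
]

def to_silver(rows):
    """Deduplicate on id and cast columns, as a silver-layer cleanse would."""
    seen, out = set(), []
    for row in rows:
        if row["id"] in seen:
            continue
        seen.add(row["id"])
        out.append({
            "id": row["id"],
            "ts": datetime.fromisoformat(row["ts"]),
            "amount": float(row["amount"]),
        })
    return out

silver = to_silver(bronze)
```

The Spark version is the same logic with `dropDuplicates` and column casts, written as a Delta `MERGE` or overwrite into the silver table.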

Posted 2 months ago

Apply

8.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Specialism: SAP

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

We are looking for a seasoned AWS Full Stack developer.

Responsibilities:
- Lead the design, development, and deployment of full-stack applications using AWS services such as Lambda, EC2, S3, and RDS.
- Strong proficiency in frontend technologies such as React, Angular, or Vue.js.
- Expertise in backend development using .NET, Node.js, Python, or Java.
- Collaborate with product managers, UX designers, and other stakeholders to gather requirements and translate them into technical solutions.
Mandatory skill sets: Cloud (AWS, Azure, GCP) services such as GCP BigQuery, Dataform, AWS Redshift; Python
Preferred skill sets: DevOps
Years of experience required: 8-12 years
Education qualification: BE/B.Tech/MBA/MCA/M.Tech
Degrees/Field of Study required: Bachelor of Technology, Master of Computer Applications
Required Skills: AWS DevOps, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}

Posted 2 months ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

We are seeking a seasoned Senior Data Engineer to join our Marketing Data Platform team. This role is pivotal in designing, building, and optimizing scalable data pipelines and infrastructure that support our marketing analytics and customer engagement strategies. The ideal candidate will have extensive experience with big data technologies, cloud platforms, and a strong understanding of marketing data dynamics.

Data Pipeline Development & Optimization: Design, develop, and maintain robust ETL/ELT pipelines using Apache PySpark on GCP services like Dataproc and Cloud Composer. Ensure data pipelines are scalable, efficient, and reliable enough to handle large volumes of marketing data.

Data Warehousing & Modeling: Implement and manage data warehousing solutions using BigQuery, ensuring optimal performance and cost-efficiency. Develop and maintain data models that support marketing analytics and reporting needs.

Collaboration & Stakeholder Engagement: Work closely with marketing analysts, data scientists, and cross-functional teams to understand data requirements and deliver solutions that drive business insights. Translate complex business requirements into technical specifications and data architecture.

Data Quality & Governance: Implement data quality checks and monitoring to ensure the accuracy and integrity of marketing data. Adhere to data governance policies and ensure compliance with data privacy regulations.

Continuous Improvement & Innovation: Stay abreast of emerging technologies and industry trends in data engineering and marketing analytics. Propose and implement improvements to existing data processes and infrastructure.

Years of Experience: 5 years in the data engineering space
Education Qualification & Certifications: B.Tech or MCA
Experience: Proven experience with Apache PySpark, GCP (including Dataproc, BigQuery, Cloud Composer), and data pipeline orchestration.
Technical Skills: Proficiency in SQL and Python. Experience with data modeling, ETL/ELT processes, and data warehousing concepts.
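The data-quality checks mentioned in the posting usually boil down to simple batch-level predicates evaluated before data is published downstream. A hedged, pure-Python sketch (the field names are illustrative, not from any real schema):

```python
# Hypothetical batch of marketing rows; "campaign_id" and "spend" are invented fields.
rows = [
    {"campaign_id": "c1", "spend": 120.0},
    {"campaign_id": "c2", "spend": 0.0},
]

def run_quality_checks(batch):
    """Return check name -> bool for a batch, in the spirit of pipeline DQ gates."""
    return {
        "non_empty": len(batch) > 0,
        "no_null_campaign_id": all(r.get("campaign_id") is not None for r in batch),
        "spend_non_negative": all(r.get("spend", 0) >= 0 for r in batch),
    }

results = run_quality_checks(rows)
```

In a real pipeline these gates would run as a Cloud Composer task (or a tool like Great Expectations) and fail the DAG when any check returns False.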

Posted 2 months ago

Apply

10.0 - 12.0 years

35 - 40 Lacs

Hyderabad

Work from Office

Career Category: Engineering

Job Description

ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description: The Director for Data Architecture and Solutions will lead Amgen's enterprise data architecture and solutions strategy, overseeing the design, integration, and deployment of scalable, secure, and future-ready data systems. This leader will define the architectural vision and guide a high-performing team of architects and technical experts to implement data and analytics solutions that drive business value and innovation. This role demands a strong blend of business acumen, deep technical expertise, and strategic thinking to align data capabilities with the company's mission and growth. The Director will also serve as a key liaison with executive leadership, influencing technology investment and enterprise data direction.

Roles & Responsibilities:
- Develop and own the enterprise data architecture and solutions roadmap, aligned with Amgen's business strategy and digital transformation goals.
- Provide executive leadership and oversight of data architecture initiatives across business domains (R&D, Commercial, Manufacturing, etc.).
- Lead and grow a high-impact team of data and solution architects. Coach, mentor, and foster innovation and continuous improvement in the team.
- Design and promote modern data architectures (data mesh, data fabric, lakehouse, etc.) across hybrid cloud environments and enable them for AI readiness.
- Collaborate with stakeholders to define solution blueprints, integrating business requirements with technical strategy to drive value.
- Drive enterprise-wide adoption of data modeling, metadata management, and data lineage standards.
- Ensure solutions meet enterprise-grade requirements for security, performance, scalability, compliance, and data governance.
- Partner closely with Data Engineering, Analytics, AI/ML, and IT Security teams to operationalize data solutions that enable advanced analytics and decision-making.
- Champion innovation and continuous evolution of Amgen's data and analytics landscape through new technologies and industry best practices.
- Communicate architectural strategy and project outcomes to executive leadership and other non-technical stakeholders.

Functional Skills:

Must-Have Skills:
- 10+ years of experience in data architecture or solution architecture leadership roles, including experience at the enterprise level.
- Proven experience leading architecture strategy and delivery in the life sciences or pharmaceutical industry.
- Expertise in cloud platforms (AWS, Azure, or GCP) and modern data technologies (data lakes, APIs, ETL/ELT frameworks).
- Strong understanding of data governance, compliance (e.g., HIPAA, GxP), and data privacy best practices.
- Demonstrated success managing cross-functional, global teams and large-scale data programs.
- Experience with enterprise architecture frameworks (TOGAF, Zachman, etc.).
- Proven leadership skills with a track record of managing and mentoring high-performing data architecture teams.

Good-to-Have Skills:
- Master's or doctorate in Computer Science, Engineering, or a related field.
- Certifications in cloud architecture (AWS, GCP, Azure).
- Experience integrating AI/ML solutions into enterprise data architecture.
- Familiarity with DevOps, CI/CD pipelines, and Infrastructure as Code (Terraform, CloudFormation).
- Scaled Agile or similar methodology experience.

Leadership and Communication Skills:
- Strategic thinker with the ability to influence at the executive level.
- Strong executive presence with excellent communication and storytelling skills.
- Ability to lead in a matrixed, global environment with multiple stakeholders.
- Highly collaborative, proactive, and business-oriented mindset.
- Strong organizational and prioritization skills to manage complex initiatives.
- High degree of initiative and self-motivation. Ability to manage multiple priorities successfully.

Basic Qualifications: Doctorate degree and 2 years of Information Systems experience, or Master's degree and 6 years of Information Systems experience, or Bachelor's degree and 8 years of Information Systems experience, or Associate's degree and 10 years of Information Systems experience, or 4 years of managerial experience directly managing people and leadership experience leading teams, projects, or programs.

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Job Title: Microsoft Fabric Data Engineer
Location: Bangalore
Job Type: Contract (24 Months)

Job Description: We are seeking a highly skilled and experienced Microsoft Fabric Data Engineer/Architect to design, develop, and maintain robust, scalable, and secure data solutions within the Microsoft Fabric ecosystem. This role will leverage the full suite of Microsoft Azure data services, including Azure Databricks, Azure Data Factory, and Azure Data Lake, to build end-to-end data pipelines, data warehouses, and data lakehouses that enable advanced analytics and business intelligence.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data architecture and engineering, with a strong focus on Microsoft Azure data platforms.
- Proven hands-on expertise with Microsoft Fabric and its components, including: OneLake; Data Factory (Pipelines, Dataflows Gen2); Synapse Analytics (Data Warehousing, SQL analytics endpoint); Lakehouses and Warehouses; Notebooks (PySpark).
- Extensive experience with Azure Databricks, including Spark development (PySpark, Scala, SQL).
- Strong proficiency in Azure Data Factory for building and orchestrating ETL/ELT pipelines.
- Deep understanding and experience with Azure Data Lake Storage Gen2.
- Proficiency in SQL (T-SQL, Spark SQL), Python, and/or other relevant scripting languages.
- Solid understanding of data warehousing concepts, dimensional modeling, and data lakehouse architectures.
- Experience with data governance principles and tools (e.g., Microsoft Purview).
- Familiarity with CI/CD practices, version control (Git), and DevOps for data pipelines.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in a fast-paced, agile environment.

Preferred Qualifications:
- Microsoft certifications in Azure data engineering (e.g., DP-203, DP-600: Microsoft Fabric Analytics Engineer Associate).
- Experience with Power BI for data visualization and reporting.
- Familiarity with real-time analytics and streaming data processing.
- Exposure to machine learning workflows and integrating ML models with data solutions.
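Dimensional modeling, which the posting calls out, centers on resolving a fact table's surrogate keys against dimension tables. A toy sketch of that star-schema join in plain Python (column names are invented; in a warehouse this is a plain SQL join):

```python
def join_star(fact_rows, dim_rows, key):
    """Resolve a surrogate key against a dimension, dropping unmatched facts."""
    dim = {d[key]: d for d in dim_rows}
    return [{**f, **dim[f[key]]} for f in fact_rows if f[key] in dim]

# Hypothetical fact and dimension tables.
fact_sales = [
    {"date_key": 20240101, "amount": 99.0},
    {"date_key": 20240102, "amount": 45.0},
]
dim_date = [
    {"date_key": 20240101, "month": "Jan"},
    {"date_key": 20240102, "month": "Jan"},
]
enriched = join_star(fact_sales, dim_date, "date_key")
```

Dropping unmatched facts mirrors an inner join; a warehouse design would more often use a left join against a "unknown member" dimension row so no facts are silently lost.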

Posted 2 months ago

Apply

10.0 - 15.0 years

6 - 10 Lacs

Jaipur

Work from Office

ABOUT HAKKODA
Hakkoda, an IBM Company, is a modern data consultancy that empowers data-driven organizations to realize the full value of the Snowflake Data Cloud. We provide consulting and managed services in data architecture, data engineering, analytics, and data science. We are renowned for bringing our clients deep expertise, being easy to work with, and being an amazing place to work! We are looking for curious and creative individuals who want to be part of a fast-paced, dynamic environment, where everyone's input and efforts are valued. We hire outstanding individuals and give them the opportunity to thrive in a collaborative atmosphere that values learning, growth, and hard work. Our team is distributed across North America, Latin America, India, and Europe. If you have the desire to be a part of an exciting, challenging, and rapidly-growing Snowflake consulting services company, and if you are passionate about making a difference in this world, we would love to talk to you!

We are seeking a highly skilled and experienced AWS Administrator to join a long-term project (12+ months), fully allocated and 100% hands-on. This role will backfill a senior AWS Admin with 10-15 years of experience and requires deep technical capability across AWS infrastructure services. This is not a team leadership role: the ideal candidate will operate independently, take full ownership of AWS administration tasks, and contribute directly to maintaining and optimizing cloud operations.

Role & Responsibilities:
- AWS Infrastructure Management: Provision, configure, and maintain AWS services such as EC2, S3, IAM, VPC, Lambda, RDS, CloudWatch, CloudTrail, and more.
- Monitoring & Incident Response: Set up monitoring, logging, and alerting solutions. Respond to and resolve infrastructure issues proactively.
- Security & IAM: Manage IAM roles, policies, and user access with a strong focus on security best practices and compliance requirements.
- Automation & Scripting: Automate routine tasks using scripting (Bash, Python) and the AWS CLI/SDK.
- Infrastructure as Code (IaC): Use tools like Terraform or CloudFormation to manage and automate infrastructure deployments and changes.
- Cost Optimization: Monitor resource usage and implement cost-control strategies to optimize AWS spending.
- Backup & Disaster Recovery: Manage backup strategies and ensure systems are resilient and recoverable.
- Documentation: Maintain detailed and up-to-date documentation of AWS environments, standard operating procedures, and runbooks.

Skills & Qualifications:
- 10+ years of hands-on AWS administration experience.
- Strong understanding of AWS core services (EC2, S3, IAM, VPC, Lambda, RDS, etc.).
- Experience with scripting (Python, Bash, or PowerShell) and automation tooling.
- Proven expertise in using Terraform or CloudFormation.
- Deep knowledge of IAM policy creation and security best practices.
- Experience with monitoring tools such as CloudWatch, Prometheus, or third-party APM tools.
- Familiarity with CI/CD pipelines and DevOps principles.
- Strong troubleshooting skills with the ability to resolve complex infrastructure issues independently.
- Excellent communication skills with the ability to work effectively with remote teams.
- Comfortable working during US Eastern Time zone hours.

Preferred Qualifications:
- AWS certifications (e.g., SysOps Administrator Associate, Solutions Architect Associate/Professional).
- Experience in hybrid environments or with other cloud platforms (Azure, GCP).
- Familiarity with Snowflake, GitLab, or similar DevOps tooling.

Benefits:
- Health Insurance
- Paid leave
- Technical training and certifications
- Robust learning and development opportunities
- Incentive
- Toastmasters
- Food Program
- Fitness Program
- Referral Bonus Program

Hakkoda is committed to fostering diversity, equity, and inclusion within our teams. A diverse workforce enhances our ability to serve clients and enriches our culture.
We encourage candidates of all races, genders, sexual orientations, abilities, and experiences to apply, creating a workplace where everyone can succeed and thrive. Ready to take your career to the next level? Apply today and join a team that's shaping the future! Hakkoda is an IBM subsidiary which has been acquired by IBM and will be integrated into the IBM organization. Hakkoda will be the hiring entity. By proceeding with this application, you understand that Hakkoda will share your personal information with other IBM subsidiaries involved in your recruitment process, wherever these are located. More information on how IBM protects your personal information, including the safeguards in case of cross-border data transfer, is available here.
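The IAM work this posting describes revolves around the standard IAM JSON policy document. A minimal least-privilege, read-only S3 example rendered from Python (the bucket name is a placeholder, and a real deployment would attach this via Terraform/CloudFormation rather than inline JSON):

```python
import json

def least_privilege_s3_policy(bucket: str) -> str:
    """Render a read-only S3 policy document using the standard IAM JSON schema."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{bucket}",      # ListBucket applies to the bucket ARN
                f"arn:aws:s3:::{bucket}/*",    # GetObject applies to objects within it
            ],
        }],
    }
    return json.dumps(policy, indent=2)

document = least_privilege_s3_policy("example-bucket")
```

Splitting the bucket ARN from the object ARN matters: `s3:ListBucket` is a bucket-level action, while `s3:GetObject` is object-level, and a policy that mixes them on one resource silently fails.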

Posted 2 months ago

Apply

5.0 - 10.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Why Join Us? To shape the future of travel, people must come first. Guided by our Values and Leadership Agreements, we foster an open culture where everyone belongs, differences are celebrated, and we know that when one of us wins, we all win. We provide a full benefits package, including exciting travel perks, generous time-off, parental leave, a flexible work model (with some pretty cool offices), and career development resources, all to fuel our employees' passion for travel and ensure a rewarding career journey. We're building a more open world. Join us.

Data Engineer III

Introduction to the Team: Expedia Technology teams partner with our Product teams to create innovative products, services, and tools to deliver high-quality experiences for travelers, partners, and our employees. A singular technology platform powered by data and machine learning provides secure, differentiated, and personalized experiences that drive loyalty and traveler satisfaction. Expedia Group is seeking a skilled and motivated Data Engineer III to join our Finance Business Intelligence team supporting the Product & Technology Finance organization. In this role, you will help drive data infrastructure and analytics solutions that support strategic financial planning, reporting, and operational decision-making across the Global Finance community. You'll work closely with Finance and Technology partners to ensure data accuracy, accessibility, and usability in support of Expedia's business objectives. As a Data Engineer III, you have strong experience working with a variety of datasets, data environments, tools, and analytical techniques. You enjoy a fun, collaborative, and stimulating team environment. Successful candidates should be able to own projects end-to-end, including identifying problems and solutions, building and maintaining data pipelines and dashboards, distilling key insights, and communicating them to stakeholders.
In this role, you will:
- Develop new and improve existing end-to-end Business Intelligence products (data pipelines, Tableau dashboards, and Machine Learning predictive forecasting models).
- Drive internal efficiencies through streamlined code/documentation/Tableau development to maintain high data integrity.
- Troubleshoot and resolve production issues with the team's products (automation opportunities, optimizations, back-end data issues, data reconciliations).
- Proactively reach out to subject matter experts/stakeholders and collaborate to solve problems.
- Respond to ad hoc data requests and conduct analysis to provide valuable insights to stakeholders.
- Collaborate and coordinate with team members/stakeholders to translate complex data into meaningful insights that improve the analytical capabilities of the business.
- Apply knowledge of database design to support migration of data pipelines from on-prem to a cloud environment (including data extraction, ingestion, and processing of large data sets).
- Support dashboard development in the cloud environment to enable self-service reporting.
- Communicate clearly on current work status and design considerations.
- Think broadly and comprehend the how, why, and what behind data architecture designs.

Experience & Qualifications:
- Bachelor's in Computer Science, Mathematics, Statistics, Information Systems, or a related field
- 5+ years of experience in a Data Analyst, Data Engineer, or Business Analyst role
- Proven expertise in SQL, with practical experience utilizing query engines including SQL Server, Starburst, Trino, and Querybook, and data science tools such as Python/R and SparkSQL.
- Proficient visualization skills (Tableau, Looker, or similar) and Excel modeling/report automation.
- Exceptional understanding of relational and dimensional datasets, data warehousing, and data mining; applies database design principles to solve data requirements.
- Experience building robust data extract, load, and transform (ELT) processes that source data from multiple databases.
- Demonstrated record of defining and executing key analyses and solving problems with minimal supervision.
- Dynamic individual contributor who consistently enhances operational playbooks to address business problems.
- 3+ years working in a hybrid environment that uses both on-premise and cloud technologies is preferred.
- Experience working in an environment that manipulates large datasets on a cloud platform preferred.
- Background in analytics, finance, or a comparable reporting and analytics role preferred.

Accommodation requests: If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions, please reach out to our Recruiting Accommodations Team through the Accommodation Request. We are proud to be named a Best Place to Work on Glassdoor in 2024 and be recognized for award-winning culture by organizations like Forbes, TIME, Disability:IN, and others.

Expedia Group's family of brands includes: Brand Expedia, Hotels.com, Expedia Partner Solutions, Vrbo, trivago, Orbitz, Travelocity, Hotwire, Wotif, ebookers, CheapTickets, Expedia Group Media Solutions, Expedia Local Expert, CarRentals.com, and Expedia Cruises. 2024 Expedia, Inc. All rights reserved. Trademarks and logos are the property of their respective owners.

Never provide sensitive, personal information to someone unless you're confident who the recipient is. Expedia Group does not extend job offers via email or any other messaging tools to individuals with whom we have not made prior contact. Our email domain is @expediagroup.com. The official website to find and apply for job openings at Expedia Group is careers.expediagroup.com/jobs.
Expedia is committed to creating an inclusive work environment with a diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, gender, sexual orientation, national origin, disability or age.

Posted 2 months ago

Apply

1.0 - 13.0 years

13 - 14 Lacs

Pune

Work from Office

Design, develop, and maintain scalable data solutions using Starburst. Collaborate with cross-functional teams to integrate Starburst with existing data sources and tools. Optimize query performance and ensure data security and compliance. Implement monitoring and alerting systems for data platform health. Stay updated with the latest developments in data engineering and analytics.

Skills

Must have:
- Bachelor's or Master's degree in a related technical field, or equivalent related professional experience.
- Prior experience as a Software Engineer applying new engineering principles to improve existing systems, including leading complex, well-defined projects.
- Strong knowledge of big-data languages, including: SQL, Hive, Spark/PySpark, Presto, Python
- Strong knowledge of big-data platforms, such as: the Apache Hadoop ecosystem, AWS EMR, Qubole, or Trino/Starburst
- Good knowledge and experience in cloud platforms such as AWS, GCP, or Azure.
- Continuous learner with the ability to apply previous experience and knowledge to quickly master new technologies.
- Demonstrates the ability to select among the technology available to implement and solve for the need.
- Able to understand and design moderately complex systems.
- Understanding of testing and monitoring tools. Ability to test, debug, and fix issues within established SLAs.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Understanding of data governance and compliance standards.

Nice to have:
- Data Architecture & Engineering: Design and implement efficient and scalable data warehousing solutions using Azure Databricks and Microsoft Fabric.
- Business Intelligence & Data Visualization: Create insightful Power BI dashboards to help drive business decisions.

Other Languages: English: C1 Advanced
Seniority: Senior
Req. VR-114886 - Data Science, BCM Industry - 05/06/2025 - Starburst Engineer in Pune

Posted 2 months ago

Apply

3.0 - 5.0 years

9 - 10 Lacs

Pune

Work from Office

Design, develop, and maintain Power BI workflows that take data from multiple sources and make it ready for analytics and reporting. Optimize existing workflows to ensure performance, scalability, and reliability. Support the automation of manual processes to improve operational efficiency. Document workflows, processes, and best practices for knowledge sharing. Provide training and mentorship to other team members on Alteryx development. Collaborate with other members of the team to deliver data solutions for the program.

Skills

Must have: Proficiency in Power BI Desktop and Power BI Service (5+ years of experience). Experience creating interactive dashboards, custom visuals, and reports. Data modeling: strong understanding of data modeling concepts, including relationships, calculated columns, measures, and hierarchies, with expertise in DAX (Data Analysis Expressions) for complex calculations. SQL and database management: proficiency in SQL to extract, manipulate, and analyze data from databases; knowledge of database design and querying. ETL (Extract, Transform, Load) tools: experience with data transformation and cleaning using tools such as Power Query, SSIS, or other ETL tools.

Nice to have: Data architecture and engineering: design and implement efficient, scalable data warehousing solutions using Azure Databricks and Microsoft Fabric. Business intelligence and data visualization: create insightful Power BI dashboards to help drive business decisions.

Other languages: English C1 Advanced. Seniority: Senior.

Req. VR-114885, Data Science, BCM Industry, 05/06/2025. Apply for Power BI Developer in Pune.
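The listing asks for DAX expertise for calculated measures. As a rough illustration only, the sketch below mirrors the logic of a typical pair of DAX measures (a filtered total and year-over-year growth) in plain Python over a hypothetical fact table; the measure names and data are invented for the example.

```python
# Illustrative stand-in for two common DAX measures, written in plain
# Python over a hypothetical fact table of (year, amount) rows.

def total_sales(rows, year):
    """Sum of amount for one year -- analogous to
    Total Sales = CALCULATE(SUM(Fact[Amount]), Fact[Year] = year)."""
    return sum(amount for y, amount in rows if y == year)

def yoy_growth(rows, year):
    """Year-over-year growth ratio, None when the prior year is empty --
    analogous to DIVIDE([Total Sales], [Total Sales PY]) - 1."""
    current = total_sales(rows, year)
    prior = total_sales(rows, year - 1)
    return None if prior == 0 else current / prior - 1

fact = [(2023, 100.0), (2023, 50.0), (2024, 180.0)]
print(total_sales(fact, 2024))   # 180.0
print(yoy_growth(fact, 2024))    # 0.19999... (i.e. +20%)
```

In Power BI the same logic would live in the semantic model as DAX measures; the Python version only shows the aggregation-over-filter-context idea.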

Posted 2 months ago

Apply

5.0 - 10.0 years

13 - 15 Lacs

Pune

Work from Office

Design, build, and manage data pipelines using Azure Data Integration Services (Azure Databricks, ADF, Azure Functions). Collaborate closely with the security team to develop robust data solutions that support our security initiatives. Implement, monitor, and optimize data processes, ensuring adherence to security and data governance best practices. Troubleshoot and resolve data-related issues, ensuring data quality and accessibility. Develop strategies for data acquisition and integration of new data into our existing architecture. Document procedures and workflows associated with data pipelines, contributing to best practices. Share knowledge about the latest Azure Data Integration Services trends and techniques. Implement and manage CI/CD pipelines to automate data and UI test cases and integrate testing with development pipelines. Conduct regular reviews of the system, identify possible security risks, and implement preventive measures.

Skills

Must have: Excellent command of English. Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 5+ years of experience in data integration and pipeline development using Azure Data Integration Services, including Azure Data Factory and Azure Databricks. Hands-on experience with Python and Spark. Strong understanding of security principles in the context of data integration. Proven experience with SQL and other data query languages. Ability to write, debug, and optimize data transformations and datasets. Extensive experience in designing and implementing ETL solutions using Azure Databricks, Azure Data Factory, or similar technologies. Familiarity with automated testing frameworks using Squash.

Nice to have: Data architecture and engineering: design and implement efficient, scalable data warehousing solutions using Azure Databricks and Microsoft Fabric. Business intelligence and data visualization: create insightful Power BI dashboards to help drive business decisions.

Other languages: English C1 Advanced. Seniority: Senior.

Req. VR-114884, Data Science, BCM Industry, 05/06/2025. Apply for Databricks Developer in Pune.
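The role calls for pipelines that ensure data quality and accessibility. A minimal sketch of the kind of row-level quality gate often embedded in a Databricks/ADF ingestion step is below; the field names and rules are illustrative, and it is written in plain Python rather than PySpark so it runs anywhere.

```python
# Hypothetical quality gate: split an ingested batch into rows that pass
# basic completeness checks and rows routed to a rejects table.

def validate(rows, required=("id", "event_ts")):
    """Split rows into (good, rejected); a row is rejected when any
    required field is missing or empty."""
    good, rejected = [], []
    for row in rows:
        if all(row.get(field) not in (None, "") for field in required):
            good.append(row)
        else:
            rejected.append(row)
    return good, rejected

batch = [
    {"id": 1, "event_ts": "2025-01-01T00:00:00Z"},
    {"id": 2, "event_ts": ""},               # fails: empty timestamp
    {"event_ts": "2025-01-02T00:00:00Z"},    # fails: no id
]
good, bad = validate(batch)
print(len(good), len(bad))  # 1 2
```

In a real pipeline the same split is usually expressed as a PySpark filter plus a write to a quarantine location, with the reject counts surfaced to monitoring.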

Posted 2 months ago

Apply

10.0 - 15.0 years

8 - 13 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role Overview: We are looking for a Data & Analytics Subject Matter Expert with deep expertise in data engineering, business intelligence (BI), and the AWS cloud ecosystem. This role demands strategic thinking, hands-on execution, and collaboration across technical and business teams to deliver impactful data-driven solutions.

Key Responsibilities

1. Data Architecture & Engineering: Design and implement scalable, high-performance data solutions on AWS. Build robust data pipelines, ETL/ELT workflows, and data lake architectures. Enforce data quality, security, and governance practices.

2. Business Intelligence & Insights: Develop interactive dashboards and visualizations using Power BI, Tableau, or QuickSight. Define data models and KPIs to support data-driven decision-making. Collaborate with business teams to extract insights that drive action.

3. Cloud & Advanced Analytics: Deploy data warehousing solutions using Redshift, Glue, S3, Athena, and other AWS services. Optimize storage and processing strategies for performance and cost-efficiency. Explore AI/ML integrations for predictive and advanced analytics (preferred).

4. Collaboration & Best Practices: Partner with cross-functional teams (engineering, data science, business) to align on data needs. Champion best practices in data governance, compliance, and architecture. Translate business requirements into scalable technical solutions.

Required Qualifications

Education: Bachelor's or Master's in Computer Science, Information Technology, Data Science, or a related discipline.

Experience: 10+ years of experience in data engineering, BI, and analytics domains. Proven experience with AWS data tools and modern data architectures.

Technical Skills: Strong command of AWS services: Redshift, Glue, S3, Athena, Lambda, Kinesis. Proficient in SQL, Python, or Scala. Experience building and maintaining ETL/ELT workflows and data models. Expertise in BI tools like Power BI, Tableau, QuickSight, or Looker. Familiarity with AI/ML models and frameworks is a plus.

Certifications: Preferred: AWS Certified Data Analytics - Specialty. Additional certifications in AWS, data engineering, or analytics are a plus.

Why Join Trianz? Join a high-growth, innovation-led firm delivering transformation at scale. Collaborate with global teams on cutting-edge cloud and analytics projects. Enjoy a competitive compensation structure and clear career progression pathways.
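The role mentions data lake architectures queried through Glue and Athena. One concrete piece of that design is the Hive-style partition layout those services expect on S3; the sketch below builds such a key prefix (the bucket and table names are hypothetical).

```python
# Sketch of the Hive-style partition layout that Glue/Athena expect in
# an S3 data lake, so queries with date filters can prune partitions.
from datetime import date

def partition_key(bucket, table, day):
    """Build an s3 prefix like .../year=2025/month=06/day=01/."""
    return (f"s3://{bucket}/{table}/"
            f"year={day.year}/month={day.month:02d}/day={day.day:02d}/")

print(partition_key("analytics-lake", "clickstream", date(2025, 6, 1)))
# s3://analytics-lake/clickstream/year=2025/month=06/day=01/
```

Writers (Glue jobs, Firehose, Spark) emit files under these prefixes, and the matching `PARTITIONED BY (year, month, day)` table definition lets Athena skip everything outside the filtered range.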

Posted 2 months ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Gurugram

Work from Office

Data Science Architect (Full Stack). Location: Gurugram, India (On-site/Hybrid). Type: Full-Time | 4+ Years Experience | AI, Architecture & Product Engineering.

Hubnex Labs is seeking a visionary and hands-on Full Stack Data Science Architect to lead the development of scalable AI products and reusable intellectual property (IP) that power data-driven solutions across global enterprise clients. This role requires deep technical expertise in AI/ML, data architecture, backend/frontend systems, and cloud-native technologies.

Key Responsibilities

AI & Data Science Leadership: Lead design and development of end-to-end AI/ML solutions across enterprise applications. Architect data pipelines, model training, validation, and deployment workflows. Apply cutting-edge techniques in NLP, computer vision, speech recognition, reinforcement learning, etc. Evaluate and rank algorithms based on business impact, accuracy, and scalability. Design and optimize data augmentation, preprocessing, and feature engineering pipelines. Train, validate, and fine-tune models using state-of-the-art tools and strategies. Monitor and improve model performance post-deployment.

Full Stack & Cloud Architecture: Design and implement cloud-native systems using microservices, serverless, and event-driven architectures. Build robust APIs and UIs for intelligent applications (using Python, Node.js, React, etc.). Use Docker, Kubernetes, and CI/CD pipelines for scalable deployment. Leverage technologies like Kafka, TensorFlow, Elixir, Golang, and NoSQL/graph DBs for high-performance ML products. Define infrastructure to meet latency and throughput goals for ML systems in production.

Innovation & Productization: Build reusable IP that can be adapted across industries and clients. Rapidly prototype AI features and user-facing applications for demos and validation. Collaborate closely with product managers and business stakeholders to translate use cases into scalable tech. Explore and adopt new technologies and frameworks to maintain a forward-looking tech stack.

Required Skills & Experience: 4+ years of experience building and deploying AI/ML models and scalable software systems. Strong understanding of ML frameworks (TensorFlow, Keras, PyTorch), data libraries (pandas, NumPy), and model tuning. Proven track record of working with large-scale data, data cleaning, and visualization. Expertise in Python, and experience with at least one other language (Go, Java, Scala, etc.). Experience with front-end frameworks (React, Vue, or Angular) is a plus. Proficiency in DevOps practices, CI/CD, and cloud platforms (AWS/GCP/Azure). Familiarity with event-driven systems, real-time protocols (WebSockets, MQTT), and container orchestration. Hands-on experience with NoSQL databases, data lakes, or distributed data platforms.

Preferred Traits: Experience leading agile engineering teams and mentoring junior developers. Strong architectural thinking, with an eye on scalability, maintainability, and performance. Entrepreneurial mindset with a focus on building reusable components and IP. Excellent communication skills, capable of bridging business and technical conversations.

Why Join Hubnex Labs? Own and architect impactful AI products used across industries. Shape the data science foundation of a fast-scaling software consulting powerhouse. Enjoy a creative, high-performance environment in Gurugram, with flexibility and long-term growth opportunities. Contribute to next-gen solutions in AI, cloud, and digital transformation.
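Among the responsibilities above is designing preprocessing and feature engineering pipelines. As a minimal, dependency-free sketch of one such step, the function below applies z-score standardisation, the same idea a library scaler (e.g. scikit-learn's StandardScaler) implements; the data is invented for the example.

```python
# Z-score standardisation: rescale a numeric feature to zero mean and
# unit variance, a common preprocessing step before model training.
import math

def standardize(values):
    """Return values rescaled to zero mean and unit variance."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    std = math.sqrt(var) or 1.0  # guard against constant columns
    return [(v - mean) / std for v in values]

scaled = standardize([2.0, 4.0, 6.0])
print(scaled)  # symmetric around 0: [-1.22..., 0.0, 1.22...]
```

In a production pipeline the mean and standard deviation would be fitted on training data only and persisted, so the same transform can be replayed at inference time.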

Posted 2 months ago

Apply

4.0 - 10.0 years

6 - 12 Lacs

Chennai, Bengaluru

Work from Office

Data Engineer (Azure) - Neoware Technology Solutions Private Limited

Requirements: 4-10 years of hands-on experience in designing, developing, and implementing data engineering solutions. Strong SQL development skills, including performance tuning and query optimization. Good understanding of data concepts. Proficiency in Python and a solid understanding of programming concepts. Hands-on experience with PySpark or Spark Scala for building data pipelines. Ability to ensure data consistency and address ambiguities or inconsistencies across datasets. Understanding of streaming data pipelines for near real-time analytics. Experience with Azure services including Data Factory, Functions, Databricks, Synapse Analytics, Event Hub, Stream Analytics, and Data Lake Storage. Familiarity with at least one NoSQL database. Knowledge of modern data architecture patterns and industry trends in data engineering. Understanding of data governance concepts for data platforms and analytical solutions. Experience with Git for source code version control. Experience with DevOps processes, including implementing CI/CD pipelines for data engineering solutions. Strong analytical and problem-solving skills. Excellent communication and teamwork skills.

Preferred: Azure certifications related to data engineering are highly preferred. Experience with Azure Kubernetes Service (AKS), Container Apps, and API Management. Strong understanding of and experience with BI/visualization tools like Power BI.

Chennai, Bangalore. Full time. 4+ years.
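One requirement above is ensuring data consistency across datasets. A hypothetical sketch of the kind of reconciliation check a pipeline run might log is below: compare keyed records between a source extract and its landed copy, and report missing or mismatched rows (the records are invented for the example).

```python
# Illustrative reconciliation between two keyed datasets: which source
# keys never arrived, and which arrived with different values.

def reconcile(source, target, key="id"):
    src = {r[key]: r for r in source}
    tgt = {r[key]: r for r in target}
    missing = sorted(src.keys() - tgt.keys())
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return {"missing": missing, "mismatched": mismatched}

report = reconcile(
    [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}],
    [{"id": 1, "v": "a"}, {"id": 2, "v": "B"}, {"id": 3, "v": "c"}],
)
print(report)  # {'missing': [], 'mismatched': [2]}
```

At scale the same comparison is usually done as a full outer join in Spark or SQL, but the contract (missing keys plus value mismatches) is the same.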

Posted 2 months ago

Apply

4.0 - 10.0 years

6 - 12 Lacs

Chennai, Bengaluru

Work from Office

Data Engineer (AWS) - Neoware Technology Solutions Private Limited

Requirements: 4-10 years of hands-on experience in designing, developing, and implementing data engineering solutions. Strong SQL development skills, including performance tuning and query optimization. Good understanding of data concepts. Proficiency in Python and a solid understanding of programming concepts. Hands-on experience with PySpark or Spark Scala for building data pipelines. Understanding of streaming data pipelines for near real-time analytics. Experience with Azure services including Data Factory, Functions, Databricks, Synapse Analytics, Event Hub, Stream Analytics, and Data Lake Storage. Familiarity with at least one NoSQL database. Knowledge of modern data architecture patterns and industry trends in data engineering. Understanding of data governance concepts for data platforms and analytical solutions. Experience with Git for source code version control. Experience with DevOps processes, including implementing CI/CD pipelines for data engineering solutions. Strong analytical and problem-solving skills. Excellent communication and teamwork skills.

Preferred: Azure certifications related to data engineering are highly preferred. Experience with Amazon AppFlow, EKS, API Gateway, and NoSQL database services. Strong understanding of and experience with BI/visualization tools like Power BI.

Chennai, Bangalore. Full time. 4+ years.

Posted 2 months ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Gurugram

Work from Office

Manager, Business Intelligence & Reporting. Gurugram/Bangalore, India.

AXA XL recognizes that digital, data, and information assets are critical for the business, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable, enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained advantage. Our Data, Intelligence & Analytics function is focused on driving innovation through optimizing how we leverage digital and data to drive strategy and differentiate ourselves from the competition. As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on the use of data and strengthens our digital reporting capabilities, we are seeking a Manager, BI and Reporting. In this role you will support and manage BI & Reporting.

What you'll be DOING

What will your essential responsibilities include? Be an expert in and manage BI & Reporting BAU, with effective stakeholder management. Be an expert in and manage BI & Reporting products, and impart training to users for self-service reporting. Manage IDA BI & Reporting on various strategic initiatives as they arise, and enable the development of the function and related capabilities. Oversee the design, production, and change of BI reporting. Contribute to best practices and standards to ensure a consistent approach to BI reporting. Raise data gaps to the IDA Data Quality team so that accurate information flows into our data architecture. Partner with key areas within IDA, GT, and business stakeholders for any BI requirement. Instill a customer-first culture, prioritizing service for our business stakeholders above all. An understanding of AI will be an added advantage. You will report to the Senior Delivery Lead, Business Intelligence & Reporting.

What you will BRING

We're looking for someone who has these abilities and skills.

Required Skills and Abilities: Ability to communicate well within the team, with peers, and with teams across the globe, and to manage stakeholders effectively. Brings a collaborative spirit, a can-do attitude, and a customer-first mindset. A minimum of a bachelor's or master's degree in a relevant discipline. Relevant years of experience in a data role (analytics or engineering) supporting multiple specialty areas of Data and Analytics. Passion for digital and data, and experience working within a digital and data-driven organization. Experience with BI tools like Power BI.

Desired Skills and Abilities: Intermediate proficiency in SQL, advanced Excel, and Power BI. Able to help and guide team members on technical issues and develop them so that the team can manage such issues self-directedly. Ability to lead a project/team.

Who WE are

AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals, and even some inspirational individuals, we don't just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business: property, casualty, professional, financial lines, and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com.

What we OFFER

Inclusion: AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success. That's why we have made a strategic commitment to attract, develop, advance, and retain the most inclusive workforce possible, and to create a culture where everyone can bring their full selves to work and reach their highest potential. It's about helping one another and our business to move forward and succeed. Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability, and inclusion, with 20 chapters around the globe. Robust support for flexible working arrangements. Enhanced family-friendly leave benefits. Named to the Diversity Best Practices Index. Signatory to the UK Women in Finance Charter. Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards: AXA XL's Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle, and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We're committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.

Sustainability: At AXA XL, sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 sustainability strategy, called "Roots of resilience", focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our pillars: Valuing nature: how we impact nature affects how nature impacts us. Resilient ecosystems, the foundation of a sustainable planet and society, are essential to our future. We're committed to protecting and restoring nature, from mangrove forests to the bees in our backyard, by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans. Addressing climate change: the effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net-zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions. Integrating ESG: all companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We're training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting. AXA Hearts in Action: we have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL's Hearts in Action programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day, the Global Day of Giving. For more information, please see axaxl.com/sustainability.

Posted 2 months ago

Apply

16.0 - 18.0 years

50 - 60 Lacs

Gurugram

Work from Office

Join us as a Data Engineer. We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure. Day-to-day, you'll develop innovative, data-driven solutions through data pipelines, modelling, and ETL design, while aspiring to be commercially successful through insights. If you're ready for a new challenge, and want to bring a competitive edge to your career profile by delivering streaming data ingestions, this could be the role for you. We're offering this role at assistant vice president level.

What you'll do: Your daily responsibilities will include developing a comprehensive knowledge of our data structures and metrics, advocating for change when needed for product development. You'll also provide transformation solutions and carry out complex data extractions. We'll expect you to develop a clear understanding of data platform cost levels to build cost-effective and strategic solutions. You'll also source new data by using the most appropriate tooling before integrating it into the overall solution to deliver it to our customers.

You'll also be responsible for: Driving customer value by understanding complex business problems and requirements, to correctly apply the most appropriate and reusable tools to build data solutions. Participating in the data engineering community to deliver opportunities to support our strategic direction. Carrying out complex data engineering tasks to build a scalable data architecture, and transforming data to make it usable to analysts and data scientists. Building advanced automation of data engineering pipelines through the removal of manual stages. Leading on the planning and design of complex products, and providing guidance to colleagues and the wider team when required.

The skills you'll need: To be successful in this role, you'll have an understanding of data usage and dependencies with wider teams and the end customer. You'll also have experience of extracting value and features from large-scale data. We'll expect you to have experience of ETL technical design; data quality testing, cleansing, and monitoring; data sourcing, exploration, and analysis; and data warehousing and data modelling capabilities. You'll also need: Experience of using programming languages, alongside knowledge of data and software engineering fundamentals. Good knowledge of modern code development practices. Great communication skills, with the ability to proactively engage with a range of stakeholders.

Hours: 45. Job Posting Closing Date: 16/06/2025
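The role centres on pipeline automation and streaming/incremental ingestion. One widely used pattern behind such automation is the high-watermark incremental load, sketched below in plain Python under invented table and column names: each run picks up only rows newer than the last stored watermark, then advances it.

```python
# High-watermark incremental ingestion (illustrative): only rows with a
# timestamp beyond the stored watermark are loaded on each run.

def incremental_load(rows, watermark):
    """Return (new_rows, new_watermark) for rows with ts > watermark."""
    new_rows = [r for r in rows if r["ts"] > watermark]
    new_watermark = max((r["ts"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

batch = [{"ts": 1, "v": "a"}, {"ts": 2, "v": "b"}, {"ts": 3, "v": "c"}]
loaded, wm = incremental_load(batch, watermark=1)
print(len(loaded), wm)  # 2 3
```

Re-running with the advanced watermark loads nothing, which is what makes the pipeline safely re-runnable; in a real system the watermark would be persisted between runs.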

Posted 2 months ago

Apply

15.0 - 19.0 years

40 - 45 Lacs

Pune

Work from Office

Skill Name: Data Architect with Azure & Databricks + Power BI. Experience: 15-19 years.

Responsibilities: Architect and design end-to-end data solutions on a cloud platform, focusing on data warehousing and big data platforms. Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions. Develop high-level and detailed data architecture and design documentation. Implement data management and data governance strategies, ensuring compliance with industry standards. Architect both batch and real-time data solutions, leveraging cloud-native services and technologies. Design and manage data pipeline processes for historic data migration and data integration. Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables. Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies. Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively. Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.

Requirements: Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments. Extensive experience with common Azure services such as ADLS, Synapse, Databricks, and Azure SQL. Experience with Azure services such as ADF, PolyBase, and Azure Stream Analytics. Proven expertise in Databricks architecture, Delta Lake, Delta Sharing, Unity Catalog, data pipelines, and Spark tuning. Strong knowledge of Power BI architecture, DAX, and dashboard optimization. In-depth experience with SQL, Python, and/or PySpark. Hands-on knowledge of data governance, lineage, and cataloging tools such as Azure Purview and Unity Catalog. Experience implementing CI/CD pipelines for data and BI components (e.g., using Azure DevOps or GitHub). Experience building semantic models in Power BI. Strong expertise in data exploration using SQL and a deep understanding of data relationships. Extensive knowledge and implementation experience in data management, governance, and security frameworks. Proven experience in creating high-level and detailed data architecture and design documentation. Strong aptitude for business analysis to understand domain data requirements. Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred. Hands-on experience architecting end-to-end data solutions for both batch and real-time designs. Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions. Familiarity with Data Fabric and Data Mesh architecture is a plus. Excellent verbal and written communication skills.
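The Databricks requirements above revolve around Delta Lake pipelines. A plain-Python sketch of the upsert semantics at the heart of such pipelines follows; in Databricks this would be a `MERGE INTO` on a Delta table, and the keys and records here are invented for illustration.

```python
# Upsert (MERGE) semantics, sketched without Spark: update rows whose
# key already exists in the target, insert rows whose key is new.

def merge_upsert(target, updates, key="id"):
    merged = {row[key]: row for row in target}
    merged.update({row[key]: row for row in updates})  # updates win on conflict
    return sorted(merged.values(), key=lambda r: r[key])

current = [{"id": 1, "status": "old"}, {"id": 2, "status": "old"}]
changes = [{"id": 2, "status": "new"}, {"id": 3, "status": "new"}]
print(merge_upsert(current, changes))
# id 1 keeps "old", id 2 becomes "new", id 3 is inserted
```

Delta's version adds what this sketch omits: ACID transaction guarantees, file-level data skipping, and time travel over the table history.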

Posted 2 months ago

Apply

15.0 - 20.0 years

50 - 70 Lacs

Gurugram, Bengaluru

Work from Office

Join us as a Data & Analytics Analyst. Take on a new challenge in Data & Analytics and help us shape the future of our business. You'll take accountability for the analysis of complex data to identify business issues and opportunities, and support the delivery of high-quality business solutions. We're committed to mapping a career path that works for you, with a focus on helping you build new skills and engage with the latest ideas and technologies in data analytics. We're offering this role at vice president level.

What you'll do: As a Data & Analytics Analyst, you'll be driving the production of high-quality analytical input to support the development and implementation of innovative processes and problem resolution. You'll be capturing, validating, and documenting business and data requirements, making sure they are in line with key strategic principles. We'll look to you to interrogate, interpret, and visualise large volumes of data to identify, support, and challenge business opportunities and identify solutions.

You'll also be: Performing data extraction, storage, manipulation, processing, and analysis. Conducting and supporting options analysis, identifying the most appropriate solution. Accountable for the full traceability and linkage of business requirements to analytics outputs. Seeking opportunities to challenge and improve current business processes, ensuring the best result for the customer. Creating and executing quality assurance at various stages of the project, in order to validate the analysis, ensure data quality, identify data inconsistencies, and resolve them as needed. Bringing a strong sense of ownership with a focus on delivering high-quality outcomes, exceptional attention to detail, an emphasis on measurable outcomes and impact, and expertise in data analytics and reporting.

The skills you'll need: You'll need a background in business analysis tools and techniques, along with the ability to influence through communications tailored to a specific audience. Additionally, you'll need the ability to use core technical skills. You'll also demonstrate: Clear and effective communication. Proficiency in SQL, and tools such as Excel and Power BI. Experience in Informatica, Snowflake, or others. Responsibility for performance metrics and data solutions across the entire data architecture team. Skill in data visualization, report generation, and presentation to both technical and business audiences. Over 15 years of professional experience.

Hours: 45. Job Posting Closing Date: 23/06/2025

Posted 2 months ago

Apply

6.0 - 10.0 years

13 - 17 Lacs

Chennai

Work from Office

Overview: Prodapt is looking for a Data Model Architect. The candidate should be strong in design and data architecture in the telecom domain.

Responsibilities

Deliverables: Design and document the data model for the CMDB and the P-S-R catalogue (the Product, Service, and Resource management layers). Design, document, and build interface specifications for data integration.

Activities: Data architecture and modeling: design and maintain conceptual, logical, and physical data models. Ensure scalability and adaptability of data models for future organizational needs. Model P-S-R catalogs in the existing Catalog, SOM, and COM systems. CMDB design and management: architect and optimize the CMDB to accurately reflect infrastructure components, telecom assets, and their relationships. Define data governance standards and enforce data consistency across the CMDB. Design data integrations across systems (e.g., OSS/BSS, network monitoring tools, billing systems).

Good communication skills. Bachelor's degree.
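The P-S-R catalogue the role models is a layered relationship: products are composed of services, which are realised by resources. A minimal sketch of that structure in code (class and instance names are invented for illustration, not a real catalogue) is:

```python
# Illustrative Product-Service-Resource hierarchy: each catalogue layer
# references the layer below it, the structure a CMDB model captures.
from dataclasses import dataclass, field

@dataclass
class Resource:            # e.g. a network element
    name: str

@dataclass
class Service:             # realised by one or more resources
    name: str
    resources: list = field(default_factory=list)

@dataclass
class Product:             # sold to the customer, composed of services
    name: str
    services: list = field(default_factory=list)

fiber = Product("Home Fibre 1Gbps",
                services=[Service("Broadband Access",
                                  resources=[Resource("OLT-Chennai-01")])])
print(fiber.services[0].resources[0].name)  # OLT-Chennai-01
```

In a relational CMDB model the same links become foreign keys (product-to-service and service-to-resource association tables), which is what lets impact analysis trace a resource fault up to the affected products.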

Posted 2 months ago

Apply

5.0 - 10.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Overview *Need Telecom Solution Architect/Digital Solution Designer with telecom domain experience. *Should have handson in Lead to Cash journey - Processes and Systems for Cellular Mobile service for Retail customers *Should have solid understanding on different Service Fulfilment journey metrics. *Should have solid understanding of different BSS/OSS applications that support Lead to Cash journey. *Should have strong understanding of different Channels, Product Catalog, Order Orchestration, Provisioning & Activation *Should have conducted workshops to gather requirements from stakeholders, presenting different solution options to architects. *Should have expertise in: **Designing multi-channel Enterprise solutions among BSS domains CRM, Order Management, BPM, OmniChannel implementations, TMForum Open APIs , Catalog, Intelligence platforms etc **Knowledge on Event/Data driven architecture, micro service framework and API integration **Experience in designing enterprise systems in BSS Responsibilities Mandatory to have excellent understanding of Telecom Domain, and experience of Consulting / Solution Design / Solution Architecture for Telecom Solutions (Products and Services development). Ability to analyze Telco processes, simplify them and develop solutions in alignment with architecture principles. Experience in working with business and application teams to collect requirements and deliver solutions. Experience in BSS/OSS is must, at least 5+ Years. Understands Open API or TMF API specification or API based integration with partners Define L2/L3 data model using reference data architecture like SID Developing micro services-based architecture TMF - open digital architecture or similar framework to develop channels architecture Experience of using modelling tools for architecture development As a Telecom Domain expert, you will be required to seek out the best ways to improve business processes and effectiveness through technology, strategy, analytic solutions. 
Brings strong knowledge of the telecom domain, with working experience across business and technology roles, and across B2B and B2C businesses.
Experience with business analysis, business process re-engineering, and E2E solution design / component or system design / solution architecture in the telecom OSS/BSS domain is required.
Good to have working experience in digital transformation, OSS/BSS cloud implementations, and network transformation programs.
Understanding of the technology landscape, such as E2E OSS & BSS, microservices-based architecture, RPA, service assurance, billing, and invoicing for B2C business.
Should have a business focus and an equal understanding of IT/technology enablement, so as to align business goals and objectives with IT delivery.
Good experience in data modelling and integration patterns.
Good understanding of frameworks relevant to the telecommunications domain (e.g. eTOM, SID, TAM, ITIL, ODA).
Able to understand and translate business requirements, working with product, technical, and operations teams to help design and build new products and services.
Able to conduct workshops and design-thinking sessions to understand IT processes, technology, architecture, and existing OSS/BSS implementations.
Able to deliver intelligent business insights through the translation of reports and analytics.
Understanding of technologies such as 4G/5G/Metro Ethernet/ATM/MPLS/SONET/SDH and their mapping to OSS systems is an added advantage.
Conduct proofs of concept (POCs) for requirements.
Good communication and presentation skills.
Good to have: understanding of autonomous networks and closed-loop automation.
Knowledge of architecture design methodologies and industry reference frameworks such as TOGAF, with a clear demonstration of the deliverables created as a result of applying the framework.
Knowledge of 5G & IoT.
Experience in any of the COTS products.
Experience in network optimization techniques.
Understanding of applying intelligence across the OSS landscape.
Experience working on network technologies.
Knowledge of DevOps and open-source tools.
Experience in any one of the programming languages: Java or Python.
Experience in any one of the DB technologies: Oracle, MySQL, Postgres, Cassandra.
Working with development teams and product managers to build software solutions and web applications.
Bachelor's degree (in any field) is mandatory; MSc/BE/Master's with specialization in IT/Computer Science is desired.
At least 5 years of work experience.
Adaptability and experience working in multi-channel delivery projects is preferred.
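As a small illustration of the TMF Open APIs mentioned above, the sketch below assembles a TMF620-style (Product Catalog Management) ProductOffering resource. This is a minimal sketch only: the host URL, identifiers, and field values are hypothetical, and a real catalog implementation would follow the full TMF620 schema.

```python
# Minimal sketch of a TMF620-style ProductOffering payload, the kind of
# resource exchanged over TMF Open APIs in a product catalog.
# The href host, ids, and prices below are hypothetical examples.

def build_product_offering(offer_id: str, name: str,
                           price: float, currency: str = "USD") -> dict:
    """Assemble a TMF620-style ProductOffering resource as a dict."""
    return {
        "id": offer_id,
        "href": ("https://example.com/tmf-api/productCatalogManagement"
                 f"/v4/productOffering/{offer_id}"),
        "name": name,
        "lifecycleStatus": "Active",
        "productOfferingPrice": [
            {
                "priceType": "recurring",
                "recurringChargePeriod": "month",
                "price": {"unit": currency, "value": price},
            }
        ],
    }

offering = build_product_offering("5001", "Mobile 5G Unlimited", 49.99)
print(offering["name"])
```

In an order-orchestration flow, a payload like this would typically be returned by the catalog API and referenced from product order items by its `id`/`href`.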

Posted 2 months ago

Apply

5.0 - 8.0 years

0 - 1 Lacs

Ahmedabad

Work from Office

We are seeking an experienced Solution Architect to design scalable, high-performance technical solutions aligned with business goals. The role involves collaborating with cross-functional teams and leading architecture strategy and implementation.

Posted 2 months ago

Apply

8.0 - 10.0 years

13 - 18 Lacs

Hyderabad

Work from Office

Job Track Description:
The ETL Data Architect is responsible for driving data migration strategy and execution within a complex enterprise landscape, bringing in the best practices of data migration and integration with Salesforce:
Bring in best practices for Salesforce data migration and integration.
Create the data migration strategy for Salesforce implementations.
Define a template/uniform file format for migrating data into Salesforce.
Must-Have Skills:
Data Architect with 8-10 years of ETL experience and 5+ years of Informatica Cloud (IICS, ICRT) experience.
5+ years of experience with Salesforce systems.
Develop comprehensive data mapping and transformation plans to align data with the Salesforce data model and software solution.
Good understanding of the Salesforce data model and Schema Builder.
Excellent understanding of relational database concepts and how to best implement database objects in Salesforce.
Experience integrating large sets of data into Salesforce from multiple data sources.
Experience with EDI transactions.
Experience in the design and development of ETL/data pipelines.
Excellent understanding of SOSL, SOQL, and the Salesforce security model.
Full understanding of the project life cycle and development methodologies.
Ability to interact with technical and functional teams.
Excellent oral and written communication and presentation skills.
Should be able to work in an offshore/onsite model.
Experience:
Expert in ETL development with Informatica Cloud using various connectors.
Experience with real-time integrations and batch scripting.
Expert in implementing business rules by creating various transformations, working with multiple data sources (flat files, relational and cloud databases, etc.), and developing mappings.
Experience in using IICS workflow tasks: Session, Control Task, Command tasks, Decision tasks, Event Wait, Email tasks, pre-session, post-session, and pre/post commands.
Ability to migrate objects in all phases (DEV, QA/UAT, and PRD) following standard defined processes.
Performance analysis with large data sets.
Experience in writing technical specifications based on conceptual design and stated business requirements.
Experience in designing and maintaining logical and physical data models, communicating them to peers and junior associates using flowcharts, unified data language, and data flow diagrams.
Good knowledge of SQL, PL/SQL, and data warehousing concepts.
Experience in using Salesforce SOQL is a plus.
Responsibilities:
Excellent troubleshooting and debugging skills in Informatica Cloud.
Significant knowledge of PL/SQL, including tuning, triggers, ad hoc queries, and stored procedures.
Strong analytical skills.
Works under minimal supervision with some latitude for independent judgement.
Prepare and package scripts and code across development, test, and QA environments.
Participate in change-control planning for production deployments.
Conducts tasks and assignments as directed.
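The "data mapping and transformation plans" this role calls for boil down to renaming and normalizing source fields so they match Salesforce API field names. The sketch below shows one such mapping step in Python; the legacy column names and the mapping itself are hypothetical examples, since a real mapping comes from the project's data mapping document.

```python
# Sketch of one data-mapping step when migrating legacy records into
# Salesforce. Source column names and the FIELD_MAP are hypothetical;
# the target names follow the standard Salesforce Contact API names.

FIELD_MAP = {
    "first_name": "FirstName",
    "last_name": "LastName",
    "email_addr": "Email",
}

def to_salesforce_contact(row: dict) -> dict:
    """Rename mapped source columns to Salesforce Contact API names,
    drop unmapped columns, and normalize the email address."""
    record = {sf: row[src] for src, sf in FIELD_MAP.items() if src in row}
    if "Email" in record:
        record["Email"] = record["Email"].strip().lower()
    return record

legacy = {"first_name": "Asha", "last_name": "Patel",
          "email_addr": " Asha.P@Example.COM ", "fax": "n/a"}
print(to_salesforce_contact(legacy))
```

In an Informatica Cloud mapping, the same rename-and-normalize logic would typically live in an Expression transformation rather than in code, but the transformation plan it documents is the same.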

Posted 2 months ago

Apply

4.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Job Area: Miscellaneous Group, Miscellaneous Group > Data Analyst
Qualcomm Overview:
Qualcomm is a company of inventors that unlocked 5G, ushering in an age of rapid acceleration in connectivity and new possibilities that will transform industries, create jobs, and enrich lives. But this is just the beginning. It takes inventive minds with diverse skills, backgrounds, and cultures to transform 5G's potential into world-changing technologies and products. This is the Invention Age, and this is where you come in.
General Summary:
About the Team
Qualcomm's People Analytics team plays a crucial role in transforming data into strategic workforce insights that drive HR and business decisions. As part of this lean but high-impact team, you will have the opportunity to analyze workforce trends, ensure data accuracy, and collaborate with key stakeholders to enhance our data ecosystem. This role is ideal for a generalist who thrives in a fast-paced, evolving environment: someone who can independently conduct data analyses, communicate insights effectively, and work cross-functionally to enhance our People Analytics infrastructure.
Why Join Us
End-to-End Impact: Work on the full analytics cycle, from data extraction to insight generation, driving meaningful HR and business decisions.
Collaboration at Scale: Partner with HR leaders, IT, and other analysts to ensure seamless data integration and analytics excellence.
Data-Driven Culture: Be a key player in refining our data lake, ensuring data integrity, and influencing data governance efforts.
Professional Growth: Gain exposure to multiple areas of people analytics, including analytics, storytelling, and stakeholder engagement.
Key Responsibilities
People Analytics & Insights:
Analyze HR and workforce data to identify trends, generate insights, and provide recommendations to business and HR leaders.
Develop thoughtful insights to support ongoing HR and business decision-making.
Present findings in a clear and compelling way to stakeholders at various levels, including senior leadership.
Data Quality & Governance:
Ensure accuracy, consistency, and completeness of data when pulling from the data lake and other sources.
Identify and troubleshoot data inconsistencies, collaborating with IT and other teams to resolve issues.
Document and maintain data definitions, sources, and reporting standards to drive consistency across analytics initiatives.
Collaboration & Stakeholder Management:
Work closely with other analysts on the team to align methodologies, share best practices, and enhance analytical capabilities.
Act as a bridge between People Analytics, HR, and IT teams to define and communicate data requirements.
Partner with IT and data engineering teams to improve data infrastructure and expand available datasets.
Qualifications
Required: 4-7 years of experience in a People Analytics-focused role.
Analytical & Technical Skills:
Strong ability to analyze, interpret, and visualize HR and workforce data to drive insights.
Experience working with large datasets and ensuring data integrity.
Proficiency in Excel and at least one data visualization tool (e.g., Tableau, Power BI).
Communication & Stakeholder Management:
Ability to communicate data insights effectively to both technical and non-technical audiences.
Strong documentation skills to define and communicate data requirements clearly.
Experience collaborating with cross-functional teams, including HR, IT, and business stakeholders.
Preferred:
Technical Proficiency: Experience with SQL, Python, or R for data manipulation and analysis. Familiarity with HR systems (e.g., Workday) and cloud-based data platforms.
People Analytics Expertise: Prior experience in HR analytics, workforce planning, or related fields. Understanding of key HR metrics and workforce trends (e.g., turnover, engagement, diversity analytics).
Additional Information
This is an office-based position (4 days a week onsite); possible locations include India and Mexico.
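Turnover is one of the key HR metrics this role works with; a common definition is exits over the period divided by average headcount. The sketch below computes it in Python under that assumption; the headcount and exit figures are made-up illustration data.

```python
# Sketch: a simple period turnover rate, one of the HR metrics
# mentioned above, defined here as exits / average headcount.
# All figures are made-up illustration data.

def turnover_rate(exits: int, start_headcount: int, end_headcount: int) -> float:
    """Exits divided by the average of starting and ending headcount."""
    avg_headcount = (start_headcount + end_headcount) / 2
    return exits / avg_headcount

# 30 exits against an average headcount of (520 + 480) / 2 = 500
rate = turnover_rate(exits=30, start_headcount=520, end_headcount=480)
print(f"{rate:.1%}")  # 6.0%
```

In practice the same calculation is usually run per month or per segment (team, location, tenure band) so trends and outliers are visible, rather than as one aggregate number.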

Posted 2 months ago

Apply

12.0 - 20.0 years

48 - 60 Lacs

Bengaluru

Work from Office

Data Architect, Bangalore (Pune option). Hybrid, 2-3 days WFO, up to 60 LPA. Needs GCP, data engineering, analytics, visualization, and modeling. 15-20 yrs experience with end-to-end data pipelines; architecture work in the most recent 1-2 years. Notice period: 60 days. Provident fund.

Posted 2 months ago

Apply