10 Job Openings at HashedIn by Deloitte
Python Expert

Gurgaon, Haryana, India

4 - 8 years

Not disclosed

On-site

Full Time

Requirements
• 4-8 years of experience with web frameworks, preferably Python Flask, Django, FastAPI, or similar.
• Good knowledge of cloud-based CI/CD service offerings from cloud service providers such as AWS, GCP, and Azure.
• Experience with other services in any of the three major cloud service providers, i.e. AWS, GCP, or Azure.
• Strong grasp of SQL fundamentals: reading and writing SQL queries, basic familiarity with database interaction tools (such as pgAdmin), columnar databases, vector databases, and database optimization techniques, including indexing. Knowledge of AWS Aurora is good to have.
• Implementation experience with Generative AI, including RAG and agentic solutions built on cloud-provided or self-hosted LLMs such as GPT-4o, Claude (on AWS), Gemini, etc., is a desired skill.
• Good knowledge of API development and testing, including but not limited to HTTP, RESTful services, Postman, and allied cloud-based services like API Gateway (a minimal illustrative sketch follows this listing).
• Strong coding, debugging, and problem-solving abilities, with good knowledge of Python. Experience with pip, setuptools, etc. is good to have.
• A technical background in data with a deep understanding of issues in multiple areas, such as data acquisition, ingestion and processing, data management, distributed processing, and high availability, is required.
• Quality delivery is the highest priority; should know industry best practices and standards for building and delivering performant, scalable APIs.
• Demonstrated expertise both in team management and as an individual contributor.
• B.E./B.Tech, MCA, or M.E./M.Tech.
This job was posted by Nikitaseles Pinto from HashedIn by Deloitte.
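Illustrative example: a minimal sketch of the API-plus-SQL work this role describes, assuming FastAPI with SQLite standing in for a managed database such as Aurora. The items table, items.db path, and endpoint are hypothetical and only for illustration.

# Hypothetical FastAPI endpoint backed by a parameterised SQL query.
# SQLite is a stand-in here; a real service would target Postgres/Aurora.
import sqlite3
from fastapi import FastAPI, HTTPException

app = FastAPI()
DB_PATH = "items.db"  # assumed local database file

@app.get("/items/{item_id}")
def read_item(item_id: int):
    conn = sqlite3.connect(DB_PATH)
    try:
        # Parameterised query avoids SQL injection and lets an index on id be used.
        row = conn.execute(
            "SELECT id, name FROM items WHERE id = ?", (item_id,)
        ).fetchone()
    finally:
        conn.close()
    if row is None:
        raise HTTPException(status_code=404, detail="Item not found")
    return {"id": row[0], "name": row[1]}

Run locally (assuming FastAPI and uvicorn are installed) with "uvicorn main:app --reload", then exercise the endpoint via Postman or curl against /items/1.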

Oracle Application Technical Consultant

Gurugram, Haryana, India

8 years

Not disclosed

On-site

Full Time

Position - Technical Lead
Location - Bangalore/Pune/Hyderabad/Gurugram/Kolkata/Chennai/Mumbai
Experience - 8+ Years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn, your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic, fun culture of inclusion, collaboration, and high performance – HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level. So, what impact will you make? Visit us @ https://hashedin.com

JOB TITLE: Data Integration Tech Lead (Oracle ODI)
We are seeking an energetic and technically proficient Data Integration Tech Lead to design, build, and optimize robust data integration and analytics solutions using the Oracle technology stack. This role puts you at the core of our enterprise data modernization efforts, responsible for designing, implementing, and maintaining end-to-end data integration pipelines across traditional and cloud platforms. You will leverage your expertise in Oracle Data Integrator (ODI), Oracle Integration Cloud (OIC), and related technologies to drive efficient data movement, transformation, and loading while maintaining the highest standards of data quality, lineage, and governance. You will work hands-on and lead a small team of developers, shaping best practices for data integration workflows and collaborating with Analytics/BI teams to deliver fit-for-purpose solutions.

Mandatory Skills:
Experience:
• 6–8 years of progressive experience in enterprise data integration, with at least 4 years of hands-on experience in Oracle Data Integrator (ODI).
• Strong understanding of and working experience with Oracle Integration Cloud (OIC), Oracle databases, and related cloud infrastructure.
• Proven track record in designing and implementing large-scale ETL/ELT solutions across hybrid (on-prem/cloud) architectures.
Technical Proficiency:
• Deep hands-on expertise with ODI components (Topology, Designer, Operator, Agent) and OIC (integration patterns, adapters, process automation).
• Strong command of SQL and PL/SQL for data manipulation and transformation.
• Experience with REST/SOAP APIs, batch scheduling, and scripting (Python, Shell, or similar) for process automation.
• Data modeling proficiency (logical/physical, dimensional, OLAP/OLTP).
• Familiarity with Oracle Analytics Cloud (OAC), OBIEE, and integration into analytics platforms.
• Solid understanding of data quality frameworks, metadata management, and lineage documentation.
• Setting up Topology, building objects in Designer, monitoring in Operator, different types of KMs, Agents, etc.
• Packaging components and database operations such as aggregate, pivot, union, etc.
• Using ODI mappings, error handling, automation with ODI, and migration of objects.
• Design and develop complex mappings, process flows, and ETL scripts.
• Expertise in developing load plans and scheduling jobs.
• Ability to design data quality and reconciliation frameworks using ODI.
• Integrate ODI with multiple sources/targets.

Data Engineer

Gurugram, Haryana, India

5 - 9 years

Not disclosed

On-site

Full Time

POSITION - Software Engineer – Data Engineering
LOCATION - Bangalore/Mumbai/Kolkata/Gurugram/Hyderabad/Pune/Chennai
EXPERIENCE - 5-9 Years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

JOB TITLE: Software Engineer – Data Engineering

OVERVIEW OF THE ROLE:
As a Data Engineer or Senior Data Engineer, you will be hands-on in architecting, building, and optimizing robust, efficient, and secure data pipelines and platforms that power business-critical analytics and applications. You will play a central role in the implementation and automation of scalable batch and streaming data workflows using modern big data and cloud technologies. Working within cross-functional teams, you will deliver well-engineered, high-quality code and data models, and drive best practices for data reliability, lineage, quality, and security.

Mandatory Skills:
• Hands-on software coding or scripting for a minimum of 4 years
• Experience in product management for at least 4 years
• Stakeholder management experience for at least 4 years
• Experience in at least one of the GCP, AWS, or Azure cloud platforms

Key Responsibilities:
• Design, build, and optimize scalable data pipelines and ETL/ELT workflows using Spark (Scala/Python), SQL, and orchestration tools (e.g., Apache Airflow, Prefect, Luigi); a minimal illustrative sketch follows this listing.
• Implement efficient solutions for high-volume, batch, real-time streaming, and event-driven data processing, leveraging best-in-class patterns and frameworks.
• Build and maintain data warehouse and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift) to support analytics, data science, and BI workloads.
• Develop, automate, and monitor Airflow DAGs/jobs on cloud or Kubernetes, following robust deployment and operational practices (CI/CD, containerization, infra-as-code).
• Write performant, production-grade SQL for complex data aggregation, transformation, and analytics tasks.
• Ensure data quality, consistency, and governance across the stack, implementing processes for validation, cleansing, anomaly detection, and reconciliation.

General Skills & Experience:
• Proficiency with Spark (Python or Scala), SQL, and data pipeline orchestration (Airflow, Prefect, Luigi, or similar).
• Experience with cloud data ecosystems (AWS, GCP, Azure) and cloud-native services for data processing (Glue, Dataflow, Dataproc, EMR, HDInsight, Synapse, etc.).
• Hands-on development skills in at least one programming language (Python, Scala, or Java preferred); solid knowledge of software engineering best practices (version control, testing, modularity).
• Deep understanding of batch and streaming architectures (Kafka, Kinesis, Pub/Sub, Flink, Structured Streaming, Spark Streaming).
• Expertise in data warehouse/lakehouse solutions (Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse) and storage formats (Parquet, ORC, Delta, Iceberg, Avro).
• Strong SQL development skills for ETL, analytics, and performance optimization.
• Familiarity with Kubernetes (K8s), containerization (Docker), and deploying data pipelines in distributed/cloud-native environments.
• Experience with data quality frameworks (Great Expectations, Deequ, or custom validation), monitoring/observability tools, and automated testing.
• Working knowledge of data modeling (star/snowflake, normalized, denormalized) and metadata/catalog management.
• Understanding of data security, privacy, and regulatory compliance (access management, PII masking, auditing, GDPR/CCPA/HIPAA).
• Familiarity with BI or visualization tools (Power BI, Tableau, Looker, etc.) is an advantage but not core.
• Previous experience with data migrations, modernization, or refactoring legacy ETL processes to modern cloud architectures is a strong plus.
• Bonus: exposure to open-source data tools (dbt, Delta Lake, Apache Iceberg, Amundsen, Great Expectations, etc.) and knowledge of DevOps/MLOps processes.

EDUCATIONAL QUALIFICATIONS:
• Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
• Certifications in cloud platforms (AWS, GCP, Azure) and/or data engineering (AWS Data Analytics, GCP Data Engineer, Databricks).
• Experience working in an Agile environment with exposure to CI/CD, Git, Jira, Confluence, and code review processes.
• Prior work in highly regulated or large-scale enterprise data environments (finance, healthcare, or similar) is a plus.
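Illustrative example: a minimal Airflow DAG sketch of the extract-transform-load orchestration described above, assuming a recent Airflow 2.x release. The DAG id, task callables, and schedule are hypothetical placeholders, not part of the role description.

# Hypothetical daily ETL DAG: extract -> transform -> load.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull raw records from the source system")

def transform(**_):
    print("cleanse, deduplicate, and aggregate the records")

def load(**_):
    print("write curated records to the warehouse")

with DAG(
    dag_id="daily_sales_etl",        # assumed name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> transform_task >> load_task

In practice each callable would be replaced by Spark jobs, SQL transformations, or warehouse-specific operators, and the DAG would be deployed through the team's CI/CD pipeline.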

Senior Engineering Manager

Bengaluru, Karnataka, India

10 - 18 years

Not disclosed

On-site

Full Time

General Skills & Experience: Minimum 10-18 years of experience
• Expertise in Spark (Scala/Python), Kafka, and cloud-native big data services (GCP, AWS, Azure) for ETL, batch, and stream processing (see the illustrative streaming sketch after this listing).
• Deep knowledge of cloud platforms (AWS, Azure, GCP), including certification (preferred).
• Experience designing and managing advanced data warehousing and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse).
• Proven experience with building, managing, and optimizing ETL/ELT pipelines and data workflows for large-scale systems.
• Strong experience with data lakes, storage formats (Parquet, ORC, Delta, Iceberg), and data movement strategies (cloud and hybrid).
• Advanced knowledge of data modeling, SQL development, data partitioning, optimization, and database administration.
• Solid understanding of and experience with Master Data Management (MDM) solutions and reference data frameworks.
• Proficient in implementing Data Lineage, Data Cataloging, and Data Governance solutions (e.g., AWS Glue Data Catalog, Azure Purview).
• Familiar with data privacy, data security, compliance regulations (GDPR, CCPA, HIPAA, etc.), and best practices for enterprise data protection.
• Experience with data integration tools and technologies (e.g., AWS Glue, GCP Dataflow, Apache NiFi/Airflow, etc.).
• Expertise in batch and real-time data processing architectures; familiarity with event-driven, microservices, and message-driven patterns.
• Hands-on experience with data analytics, BI, and visualization tools (Power BI, Tableau, Looker, Qlik, etc.) and supporting complex reporting use cases.
• Demonstrated capability with data modernization projects: migrations from legacy/on-prem systems to cloud-native architectures.
• Experience with data quality frameworks, monitoring, and observability (data validation, metrics, lineage, health checks).
• Background in working with structured, semi-structured, unstructured, temporal, and time series data at large scale.
• Familiarity with Data Science and ML pipeline integration (DevOps/MLOps, model monitoring, and deployment practices).
• Experience defining and managing enterprise metadata strategies.
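Illustrative example: a minimal Spark Structured Streaming sketch of the Kafka-to-lake stream processing named in the first bullet. The broker address, topic, schema, and storage paths are hypothetical, and the job assumes the Kafka connector package is available on the cluster.

# Hypothetical streaming aggregation: read JSON events from Kafka,
# aggregate by hourly window, and append Parquet files to object storage.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("kafka_hourly_aggregates").getOrCreate()

event_schema = (StructType()
                .add("event_type", StringType())
                .add("amount", DoubleType())
                .add("event_time", TimestampType()))

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
          .option("subscribe", "events")                      # assumed topic
          .load()
          .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

hourly = (events
          .withWatermark("event_time", "1 hour")
          .groupBy(F.window("event_time", "1 hour"), "event_type")
          .agg(F.sum("amount").alias("total_amount")))

(hourly.writeStream
 .outputMode("append")
 .format("parquet")
 .option("path", "s3a://example-bucket/curated/hourly_events")             # assumed path
 .option("checkpointLocation", "s3a://example-bucket/checkpoints/hourly")  # assumed path
 .start()
 .awaitTermination())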

Backend Developer - Java

Bengaluru, Karnataka, India

2 - 8 years

Not disclosed

On-site

Full Time

Requirements
• 2 to 8 years of hands-on core Java and object-oriented design experience.
• An understanding of how Java works and its impact on the overall performance of an application.
• A demonstrable track record of developing and delivering high-quality, efficient, and optimized software solutions to tight deadlines.
• A robust understanding of test-driven development and design patterns.
• Able to develop unit-testable code using testing frameworks like JUnit, TestNG, and Cucumber.
• Experience with Spring, Maven, and Gradle.
• Good understanding of the principles of modern software development.
• Excellent communication skills with a proven ability to work with system users and senior stakeholders.
• Experience using relational databases; proficient in SQL.
• The ability to work in a fast-paced, high-energy team environment.
• Proven ability to deliver high-quality software.
• Good understanding of agile development practices.
• HTML5 and CSS3.
• Secure coding experience; exceptional programming fundamentals and business logic in core Java.
• Robust experience with Spring/Spring Boot (framework) and Hibernate.
• Knowledge of or experience with Python is highly preferable.
• Good to have: Angular, JSP, Bootstrap, CSS, Ajax, and JavaScript (UI design).
• Good to have: Web Services, REST (middleware).
• At least 2-6 years of experience in developing enterprise Java/J2EE applications.
• At least 2 years of experience developing software applications in an Agile Scrum framework environment.
• Strong hands-on design and coding experience.
• Robust technical and functional knowledge of the projects they have worked on.
• Ability to technically guide junior developers in the team.
• Must have an Agile mindset.
This job was posted by Ranjan Kv from HashedIn by Deloitte.

Data Engineering Expert

Bengaluru, Karnataka, India

2 - 7 years

Not disclosed

On-site

Full Time

Requirements
• B.E/B.Tech, MCA, or M.E/M.Tech graduate with 2-7 years of experience.
• Should have an eye for architecture. Candidates should understand the trade-offs between architectural choices, both on a theoretical level and an applied level.
• Strong coding, debugging, and problem-solving abilities, with advanced knowledge of at least one programming language: Python, Java, or Scala.
• Expertise in efficiently leveraging the power of distributed big data systems, including but not limited to Hadoop, Hive, Spark, Kafka streaming, etc.
• Experience with cloud-based data engineering service offerings from GCP/AWS/Azure, like S3, Redshift, Athena, Kinesis, etc.
• Strong understanding of SQL, columnar databases, and data warehousing concepts.
• Experience in building scalable and performant data pipelines, implementing custom ETLs, and solutioning data lakes and data warehouses (see the illustrative pipeline sketch after this listing).
• Familiarity with common data visualization and exploration tools (Tableau, AWS QuickSight, Looker, etc.).
• A technical background in data with a deep understanding of issues in multiple areas, such as data acquisition, ingestion and processing, data management, distributed processing, and high availability, is required.
• Quality delivery is the highest priority. Should know industry best practices and standards for building and delivering performant and scalable data engineering projects.
• Must have delivered a complex project. Should have completely or partially owned the delivery of data engineering projects, including building data pipelines, building data lakes and data warehouses, ETL solutioning, etc.
• Highly innovative, flexible, and self-directed.
• Excellent written and verbal communication skills.
• We're looking for software engineers who thrive on learning new technologies and don't believe in one-size-fits-all solutions. You should be able to adapt easily to meet the needs of our massive growth and rapidly evolving business environment.
• You believe that you can achieve more in a team, that the whole is greater than the sum of its parts. You rely on others' candid feedback for continuous improvement.
• Passion for technology, the ability to switch contexts easily, enthusiasm to learn, and a passion to perform.
• Demonstrated expertise in team management. Can articulate the business metrics in place at the account, and understands and can articulate their product's business value.
• Excellence in the "Leads through Example" stage of leadership. Strong skills in mentoring junior team members.
This job was posted by Nikitaseles Pinto from HashedIn by Deloitte.
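Illustrative example: a minimal PySpark batch pipeline sketch for the custom ETL and data lake work referenced above. The bucket paths and column names are hypothetical placeholders.

# Hypothetical batch ETL: read raw CSV, clean and type the data,
# then write partitioned Parquet into a curated zone of the data lake.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_batch_etl").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("s3a://example-raw-bucket/orders/"))       # assumed landing path

curated = (raw
           .dropDuplicates(["order_id"])                # assumed key column
           .withColumn("amount", F.col("amount").cast("double"))
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("amount") > 0))

(curated.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3a://example-curated-bucket/orders/"))      # assumed curated path

A scheduler such as Airflow would typically trigger this job, and a validation step (for example Great Expectations or custom checks) would run before the curated data is published.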

Senior Technical Product Manager

Gurugram, Haryana, India

10 years

Not disclosed

On-site

Full Time

Position - Senior Technical Product Manager
Location - Gurgaon/Chennai/Hyderabad/Pune/Kolkata/Mumbai/Bangalore
Experience - 10+ Years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn, your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic, fun culture of inclusion, collaboration, and high performance – HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level.

JOB TITLE - Senior Technical Product Manager

Overview:
A Senior Technical Product Manager at HashedIn is an independent leader who combines the skills of a product manager with a strong technical background. They are a trusted advisor to all three groups – the engineering team, executive leadership, and business/SMEs. They have the ability to bridge gaps between the technical and business aspects of the project, which requires technical expertise along with product management skills.

Mandatory skills
• 10+ years of experience in industry
• Hands-on software coding or scripting for a minimum of 3 years
• Experience in product management for at least 5 years
• Stakeholder management experience for at least 5 years
• Experience in at least one of the GCP, AWS, or Azure cloud platforms
• Deep understanding of and practical experience with agile frameworks and tools, plus an understanding of APIs and modern software development practices

Key Responsibilities:
Product Planning
• Gather and prioritize requirements from stakeholders, market trends, customers, and end-users.
• Define and communicate product vision, strategy, and roadmap aligned with business objectives.
• Own the product backlog and ensure clarity and prioritization for the engineering team.
• Create and take ownership of the product roadmap, feasibility, and technical architecture, ensuring the client business and technology teams align.
• Focus on the "what" and "why" while defining product vision, strategy, and requirements based on business needs and user value.
• Create, maintain, track, and report product roadmaps, functional documentation, Definition of Done, release timelines, and scope documentation regularly.
• Ensure NFRs are identified, agreed, and documented for the development team to implement.
• Ensure that the project deliverables meet quality standards and comply with business requirements.
Technical Expertise:
• Serve as a bridge between engineering and non-technical stakeholders, translating complex technical concepts into business value.
• Possess a strong understanding of the technologies involved in the project and be able to make informed decisions about technology choices.
• Lead and facilitate technical discussions, architecture reviews, and solution design sessions with engineering teams.
• Evaluate and make decisions on technology choices, frameworks, and tools in collaboration with engineering leads.
• Help create technical documentation for the product; liaise with the client architecture team on tech specifications for various aspects.
• Propose trade-offs in architecture and/or product roadmaps to suit the business.
• Demonstrate and report tech depth and product understanding to executive leadership regularly.
Agile Methodologies:
• Adapt to changing project requirements and priorities.
• Align ways of working with Agile – Scrum and Kanban – processes, with the ability to lead all ceremonies.
• Work with UX/UI designers to ensure the product is user-friendly and meets user expectations.
• Collaborate with technology stakeholders – both client and internal – to capture quality metrics, success criteria, and entry and exit criteria of every Epic/Story.
Technical & Business Communication
• Act as the voice of the customer and business within the engineering team.
• Align cross-functional teams (engineering, design, QA, business) towards common goals.
• Communicate product release progress, risks, dependencies, and issues to stakeholders, translating technical information into business terms.
• Identify, assess, and mitigate potential risks that could impact project success.
• Analyse product performance data and user behaviour to identify areas for improvement and inform product decisions.
• Work with tech architects and business executives to triage product priorities.
Leadership Skills
• Identify and document success criteria and track them effectively.
• Lead and motivate technical teams, assign tasks, provide guidance, and ensure team members have the understanding and resources they need to succeed.
• Act as the point of contact for the development team for requirement clarifications, and as a technical reviewer for the architecture team.
• Lead and motivate teams, communicate effectively with stakeholders, and build strong relationships.

Artificial Intelligence Engineer

Gurugram, Haryana, India

12 years

Not disclosed

On-site

Full Time

Position - Senior AI Engineer and Data Scientist – Palantir Platform
Location - Gurgaon/Chennai/Hyderabad/Pune/Kolkata/Mumbai/Bangalore
Experience - 7+ Years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn, your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic, fun culture of inclusion, collaboration, and high performance – HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level.

JOB TITLE - Senior AI Engineer and Data Scientist – Palantir Platform

About the Role
We are seeking a highly skilled Senior Data Scientist & AI Engineer to architect, develop, and deploy advanced analytics and AI/ML solutions on the Palantir platform (Foundry, AIP) and/or leading cloud platforms (AWS, Azure, GCP). You will drive end-to-end data science and AI engineering initiatives, leveraging both traditional and cutting-edge technologies to deliver impactful business outcomes.

Key Responsibilities
• Lead end-to-end solution development for data science and AI/ML projects on Palantir Foundry/AIP or major cloud platforms (AWS, Azure, GCP).
• Own the full data science lifecycle: data ingestion, cleansing, transformation, and integration from diverse sources; feature engineering, selection, and ontology/schema design; exploratory data analysis (EDA) and visualization; model development, training, tuning, validation, and deployment; production monitoring, drift detection, and model retraining (see the illustrative training sketch after this listing).
• Design and implement AI engineering solutions using Palantir AIP, including: building and optimizing AIP Logic functions; leveraging LLMs for translation, classification, and document/image parsing; implementing semantic search, schema matching, and data validation; developing and deploying computer vision models for media analysis; and creating feedback loops and cross-validation workflows to enhance model performance.
• Collaborate with cross-functional teams to translate business requirements into robust technical solutions.
• Mentor and guide junior team members in data science and AI engineering best practices.
• Stay current with emerging trends in AI/ML, cloud, and Palantir technologies.

Required Skills & Experience
• Education: Bachelor's or Master's in Computer Science, Data Science, Engineering, Mathematics, or a related field.
• Experience: 7–12 years in data science, machine learning, or AI engineering roles.
• Palantir Platform: 1–2 years of hands-on experience with Palantir Foundry and/or AIP (preferred), or a strong willingness to learn.
• Cloud Platforms: Proven experience with AWS, Azure, or GCP AI/ML services (e.g., SageMaker, Azure ML, Vertex AI).
• Core Data Science Competencies: data ingestion, transformation, and integration; feature engineering and selection; ontology/schema creation and management; EDA and data visualization; model training, evaluation, hyperparameter tuning, and deployment.
• AI Engineering Competencies: object relations and data modeling; building and deploying AI logic functions (AIP Logic); working with LLMs (translation, classification, document parsing); computer vision model development and image clustering; schema matching, semantic search, and data validation; feedback loop implementation and model retraining.
• Programming: Proficiency in Python, SQL, and at least one additional language (e.g., Java, Scala, R).
• ML/AI Frameworks: Experience with Kubeflow, TensorFlow, PyTorch, scikit-learn, or similar.
• Visualization: Familiarity with matplotlib, seaborn, Power BI, Tableau, or Palantir visualization modules (Contour, Quiver).
• DevOps/MLOps: Experience with CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes) is a plus.
• Soft Skills: Strong analytical, problem-solving, and communication skills; ability to work independently and collaboratively.

Preferred Skills
• Experience with advanced LLMs (e.g., GPT-4, GPT-4o, Gemini, Claude, etc.) and production deployment.
• Familiarity with data governance, security, and compliance best practices (e.g., RBAC, audit logs, privacy).

Certifications (Nice to Have)
• Palantir Foundry or AIP certifications
• AWS, Azure, or GCP ML certifications
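Illustrative example: a minimal model-training sketch covering the train, tune, and evaluate portion of the lifecycle above, using scikit-learn on a synthetic dataset. The model choice, parameter grid, and data are hypothetical; a Palantir Foundry or cloud ML deployment would wrap similar logic.

# Hypothetical training step: split data, grid-search hyperparameters,
# and report a held-out metric before deployment and monitoring.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    scoring="f1",
    cv=3,
)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("held-out F1:", f1_score(y_test, search.best_estimator_.predict(X_test)))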

Technical Product Specialist

Gurugram, Haryana, India

6 years

Not disclosed

On-site

Full Time

Position - Technical Product Management Lead
Location - Gurgaon/Chennai/Hyderabad/Pune/Kolkata/Mumbai/Bangalore
Experience - 6+ Years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn, your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic, fun culture of inclusion, collaboration, and high performance – HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level.

JOB TITLE - Technical Product Management Lead

Overview:
A Technical Product Management Lead at HashedIn is an independent leader who combines the skills of a product manager with a strong technical background. They are a trusted advisor to all three groups – the engineering team, executive leadership, and business/SMEs. They have the ability to bridge gaps between the technical and business aspects of the project, which requires technical expertise along with product management skills.

Mandatory skills
• 6+ years of experience in industry
• Hands-on software coding or scripting for a minimum of 1-2 years
• Experience in product management for at least 3 years
• Stakeholder management experience for at least 3 years
• Experience in at least one of the GCP, AWS, or Azure cloud platforms
• Deep understanding of and practical experience with agile frameworks and tools, plus an understanding of APIs and modern software development practices

Key Responsibilities:
Product Planning
• Gather and prioritize requirements from stakeholders, market trends, customers, and end-users.
• Define and communicate product vision, strategy, and roadmap aligned with business objectives.
• Own the product backlog and ensure clarity and prioritization for the engineering team.
• Create and take ownership of the product roadmap, feasibility, and technical architecture, ensuring the client business and technology teams align.
• Focus on the "what" and "why" while defining product vision, strategy, and requirements based on business needs and user value.
• Create, maintain, track, and report product roadmaps, functional documentation, Definition of Done, release timelines, and scope documentation regularly.
• Ensure NFRs are identified, agreed, and documented for the development team to implement.
• Ensure that the project deliverables meet quality standards and comply with business requirements.
Technical Expertise:
• Serve as a bridge between engineering and non-technical stakeholders, translating complex technical concepts into business value.
• Possess a strong understanding of the technologies involved in the project and be able to make informed decisions about technology choices.
• Lead and facilitate technical discussions, architecture reviews, and solution design sessions with engineering teams.
• Evaluate and make decisions on technology choices, frameworks, and tools in collaboration with engineering leads.
• Help create technical documentation for the product; liaise with the client architecture team on tech specifications for various aspects.
• Propose trade-offs in architecture and/or product roadmaps to suit the business.
• Demonstrate and report tech depth and product understanding to executive leadership regularly.
Agile Methodologies:
• Adapt to changing project requirements and priorities.
• Align ways of working with Agile – Scrum and Kanban – processes, with the ability to lead all ceremonies.
• Work with UX/UI designers to ensure the product is user-friendly and meets user expectations.
• Collaborate with technology stakeholders – both client and internal – to capture quality metrics, success criteria, and entry and exit criteria of every Epic/Story.
Technical & Business Communication
• Act as the voice of the customer and business within the engineering team.
• Align cross-functional teams (engineering, design, QA, business) towards common goals.
• Communicate product release progress, risks, dependencies, and issues to stakeholders, translating technical information into business terms.
• Identify, assess, and mitigate potential risks that could impact project success.
• Analyse product performance data and user behaviour to identify areas for improvement and inform product decisions.
• Work with tech architects and business executives to triage product priorities.
Leadership Skills
• Identify and document success criteria and track them effectively.
• Act as the point of contact for the development team for requirement clarifications, and as a technical reviewer for the architecture team.
• Lead and motivate teams, communicate effectively with stakeholders, and build strong relationships.

Technical Architect

Pune, Maharashtra, India

6 years

Not disclosed

On-site

Full Time

Position - Technical Architect
Location - Pune
Experience - 6+ Years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn, your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic, fun culture of inclusion, collaboration, and high performance – HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level.

JOB TITLE - Technical Architect
B.E/B.Tech, MCA, or M.E/M.Tech graduate with 6-10 years of experience (this includes 4 years of experience as an application architect or data architect).
• Java/Python/UI/DE
• GCP/AWS/Azure
• Generative AI-enabled application design pattern knowledge is a value addition.
• Excellent technical background with a breadth of knowledge across analytics, cloud architecture, distributed applications, integration, API design, etc.
• Experience in technology stack selection and the definition of solution, technology, and integration architectures for small to mid-sized applications and cloud-hosted platforms.
• Strong understanding of various design and architecture patterns.
• Strong experience in developing scalable architecture.
• Experience implementing and governing software engineering processes, practices, tools, and standards for development teams.
• Proficient in effort estimation techniques; will actively support project managers and scrum masters in planning the implementation, and will work with test leads on the definition of an appropriate test strategy for the realization of a quality solution.
• Extensive experience as a technology/engineering subject matter expert, i.e., high-level solution definition, sizing, and RFI/RFP responses.
• Aware of the latest technology trends, engineering processes, practices, and metrics.
• Architecture experience with PaaS and SaaS platforms hosted on Azure, AWS, or GCP.
• Infrastructure sizing and design experience for on-premise and cloud-hosted platforms.
• Ability to understand the business domain and requirements and map them to technical solutions.
• Outstanding interpersonal skills; ability to connect and present to CXOs from client organizations.
• Strong leadership, business communication, consulting, and presentation skills.
• Positive, service-oriented personality.

OVERVIEW OF THE ROLE:
This role serves as a paradigm for the application of team software development processes and deployment procedures. Additionally, the incumbent actively contributes to the establishment of best practices and methodologies within the team. Craft and deploy resilient APIs, bridging cloud infrastructure and software development with seamless API design, development, and deployment.
• Works at the intersection of infrastructure and software engineering by designing and deploying data and pipeline management frameworks built on top of open-source components, including Hadoop, Hive, Spark, HBase, Kafka streaming, Tableau, Airflow, and other cloud-based data engineering services like S3, Redshift, Athena, Kinesis, etc.
• Collaborate with various teams to build and maintain the most innovative, reliable, secure, and cost-effective distributed solutions.
• Design and develop big data, real-time analytics, and streaming solutions using industry-standard technologies.
• Deliver the most complex and valuable components of an application on time, as per the specifications.
• Plays the role of a Team Lead; manages or influences a large portion of an account or a small project in its entirety, demonstrating an understanding of and consistently incorporating practical value with theoretical knowledge to make balanced technical decisions.



