0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description

Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride.

Together with analytics team leaders, you will support our business with excellent data models to uncover trends that can drive long-term business results.

How You Will Contribute

You will:
- Execute the business analytics agenda in conjunction with analytics team leaders
- Work with best-in-class external partners who leverage analytics tools and processes
- Use models/algorithms to uncover signals, patterns, and trends that drive long-term business performance
- Execute the business analytics agenda using a methodical approach that conveys to stakeholders what business analytics will deliver

What You Will Bring

A desire to drive your future and accelerate your career, and the following experience and knowledge:
- Using data analysis to make recommendations to analytics leaders
- Understanding of best-in-class analytics practices
- Knowledge of key performance indicators (KPIs) and scorecards
- Knowledge of BI tools such as Tableau, Excel, Alteryx, R, and Python is a plus

In This Role

As a DaaS Data Engineer, you will design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, with data quality and validation processes that maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders while staying current with the latest cloud technologies and best practices.

Role & Responsibilities
- Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
- Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
- Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
- Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
- Collaborate and Innovate: Work closely with data teams and product owners, and stay current with the latest cloud technologies and best practices.

Technical Requirements
- Programming: Python, PySpark, Go/Java
- Databases: SQL, PL/SQL
- ETL & Integration: dbt, Databricks with Delta Live Tables (DLT), AecorSoft, Talend, Informatica, Pentaho, Ab Initio, Fivetran
- Data Warehousing: slowly changing dimensions (SCD), schema types, data marts
- Visualization: Databricks Notebooks, Power BI, Tableau, Looker
- GCP Cloud Services: BigQuery, GCS, Cloud Functions, Pub/Sub, Dataflow, Dataproc, Dataplex
- AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis
- Supporting Technologies: graph databases (Neo4j), Erwin, Collibra, Ataccama DQ, Kafka, Airflow
- Experience with the RGM.ai product is an added advantage

Soft Skills
- Problem-Solving: The ability to identify and solve complex data-related challenges.
- Communication: Effective communication skills to collaborate with product owners, analysts, and stakeholders.
- Analytical Thinking: The capacity to analyse data and draw meaningful insights.
- Attention to Detail: Meticulousness in data preparation and pipeline development.
- Adaptability: The ability to stay current with emerging technologies and trends in data engineering.

Within-country relocation support is available, and for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary

At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way.
That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong global and local brands, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy, and the second position in gum. Our 80,000 makers and bakers are located in more than 80 countries, and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen, and happen fast. Mondelēz International is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. Job Type: Regular. Analytics & Modelling, Analytics & Data Science.
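As an illustration of the extract-transform-load responsibilities the posting above describes, here is a minimal, self-contained sketch in plain Python. The source rows, field names, and validation rule are invented for the example; a real pipeline would read from and write to actual systems.

```python
# Minimal, self-contained sketch of an extract-transform-load (ETL) step
# with a basic data-quality gate. All data and field names are illustrative.

def extract():
    # Stand-in for reading from a source system (database, API, files).
    return [
        {"sku": "A1", "units": "10", "region": "west"},
        {"sku": "B2", "units": "-3", "region": "east"},   # fails validation
        {"sku": "C3", "units": "7",  "region": "north"},
    ]

def transform(rows):
    # Normalize types and derive fields.
    return [{"sku": r["sku"], "units": int(r["units"]), "region": r["region"].upper()}
            for r in rows]

def validate(rows):
    # Data-quality gate: keep valid rows, collect rejects for review.
    valid = [r for r in rows if r["units"] >= 0]
    rejects = [r for r in rows if r["units"] < 0]
    return valid, rejects

def load(rows, warehouse):
    # Stand-in for writing to a warehouse or lake table.
    warehouse.extend(rows)

warehouse = []
valid, rejects = validate(transform(extract()))
load(valid, warehouse)
print(len(warehouse), len(rejects))  # 2 valid rows loaded, 1 rejected
```

The separate validation step is what keeps bad records out of the warehouse while still surfacing them for review, which is the "data accuracy and integrity" concern the role calls out.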
Posted 1 month ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description

NXP Semiconductors enables secure connections and infrastructure for a smarter world, advancing solutions that make lives easier, better and safer. As the world leader in secure connectivity solutions for embedded applications, we are driving innovation in the secure connected vehicle, end-to-end security and privacy, and smart connected solutions markets.

Organization Description

Do you feel challenged by being part of the IT department of NXP, the company with the mission of "Secure Connections for a Smarter World"? Do you perform best when representing IT in projects in a fast-moving, international environment? Within R&D IT Solutions, the Product Creation Applications (PCA) department is responsible for providing and supporting the global R&D design community with best-in-class applications and support. The applications are used by over 6,000 designers.

Job Summary

As a Graph Engineer, you will:
- Develop pipelines and code to support the ingress and egress of data to and from the knowledge graphs.
- Perform basic and advanced graph querying and data modeling on the knowledge graphs that lie at the heart of the organization's Product Creation ecosystem.
- Maintain the ETL pipelines, code, and knowledge graph so they stay scalable, resilient, and performant in line with customer requirements.
- Work in an international and Agile DevOps environment.

This position offers the opportunity to work in a globally distributed team, with unique opportunities for personal development in a multicultural environment and a challenging setting in which to develop expertise in technologies valued across the industry.

Primary Responsibilities
- Translate requirements of business functions into "graph thinking".
- Build and maintain graphs and related applications from data and information, using the latest graph technologies to leverage high-value use cases.
- Support and manage graph databases.
- Integrate graph data from various sources, internal and external.
- Extract data from various sources, including databases, APIs, and flat files.
- Load data into target systems, such as data warehouses and data lakes.
- Develop code to move data (ETL) from the enterprise platform applications into the enterprise knowledge graphs.
- Optimize ETL processes for performance and scalability.
- Collaborate with data engineers, data scientists, and other stakeholders to model the graph environment so it best represents the data coming from the multiple enterprise systems.

Skills / Experience
- Semantic Web technologies: RDF, RDFS, OWL, SHACL; SPARQL; serialization formats such as JSON-LD, N-Triples/N-Quads, Turtle, RDF/XML, and TriX
- API-led architectures: REST, SOAP, microservices, API management
- Graph databases such as Dydra, Amazon Neptune, Neo4j, or Oracle Spatial and Graph is a plus
- Experience with other NoSQL databases, such as key-value and document databases (e.g. XML databases), is a plus
- Experience with relational databases
- Programming experience, preferably Java, JavaScript, Python, PL/SQL
- Experience with web technologies: HTML, CSS, XML, XSLT, XPath
- Experience with modelling languages such as UML
- Understanding of CI/CD automation, version control, build automation, testing frameworks, static code analysis, IT service management, artifact management, and container management, with experience in related tools and platforms
- Familiarity with cloud computing concepts (e.g. AWS and Azure)

Education & Personal Skillsets
- A master's or bachelor's degree in computer science, mathematics, electronics engineering, or a related discipline, with at least 10 years of experience in a similar role
- Excellent problem-solving and analytical skills
- A growth mindset with a curiosity to learn and improve
- Team player with strong interpersonal, written, and verbal communication skills
- Business consulting and technical consulting skills
- An entrepreneurial spirit and the ability to foster a positive and energized culture
- Fluent communication skills in English (spoken and written)
- Experience working in Agile (Scrum knowledge appreciated) with a DevOps mindset
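To illustrate the kind of graph querying and data modeling the role above describes, here is a small, self-contained sketch of RDF-style triples and a basic pattern match in plain Python. The subjects, predicates, and data are invented; a real deployment would store the triples in a triple store and query them with SPARQL.

```python
# Knowledge-graph data is commonly modeled as (subject, predicate, object)
# triples. This toy in-memory store and pattern matcher mimics, in spirit,
# what a SPARQL basic graph pattern does. All identifiers are illustrative.

triples = {
    ("chip:X100", "rdf:type",       "ex:Product"),
    ("chip:X100", "ex:designedBy",  "team:Graphics"),
    ("chip:Y200", "rdf:type",       "ex:Product"),
    ("chip:Y200", "ex:designedBy",  "team:Connectivity"),
}

def match(pattern, store):
    """Return triples matching a pattern; None acts as a wildcard (like ?var)."""
    s, p, o = pattern
    return sorted(t for t in store
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

# "Which products exist?" -- analogous to:
#   SELECT ?s WHERE { ?s rdf:type ex:Product }
products = [t[0] for t in match((None, "rdf:type", "ex:Product"), triples)]
print(products)  # ['chip:X100', 'chip:Y200']
```

The point of "graph thinking" is that relationships (`ex:designedBy`) are first-class data, so traversals that would require joins in a relational schema become simple pattern matches.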
Posted 1 month ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Tata Consultancy Services is hiring Python Full Stack Developers!

Role: Python Full Stack Developer
Desired Experience Range: 6-8 years
Location of Requirement: Hyderabad and Kolkata

Desired Skills (Technical/Behavioral)

Frontend
- 6+ years of overall experience, with proficiency in React (2+ years), TypeScript (1+ year), and React hooks (1+ year)
- Experience with ESLint, CSS-in-JS styling (preferably Emotion), state management (preferably Redux), and JavaScript bundlers such as Webpack
- Experience integrating with RESTful APIs or other web services

Backend
- Expertise with Python (3+ years, preferably Python 3)
- Proficiency with a Python web framework (2+ years, preferably Flask and FastAPI)
- Experience with a Python linter (preferably flake8), graph databases (preferably Neo4j), a package manager (preferably pip), Elasticsearch, and Airflow
- Experience developing microservices, RESTful APIs, or other web services
- Experience with database design and management, including NoSQL/RDBMS tradeoffs

Interested and eligible candidates can apply!
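The backend requirements above mention database design and NoSQL/RDBMS tradeoffs. As a small illustration of the relational side, here is a self-contained sketch using Python's built-in sqlite3; the schema and data are invented for the example.

```python
import sqlite3

# A tiny normalized schema: users and their orders, linked by a foreign key.
# In an RDBMS this join is cheap and enforces integrity; in a document store
# the same data would often be denormalized (orders embedded in the user
# document), trading update complexity for read locality.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         user_id INTEGER NOT NULL REFERENCES users(id),
                         total   REAL    NOT NULL);
""")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "Asha"), (2, "Ravi")])
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0)])

# Aggregate per user via a join -- the kind of query that motivates keeping
# the data normalized in the first place.
rows = conn.execute("""
    SELECT u.name, SUM(o.total)
    FROM users u JOIN orders o ON o.user_id = u.id
    GROUP BY u.id ORDER BY u.name
""").fetchall()
print(rows)  # [('Asha', 350.0), ('Ravi', 75.0)]
```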
Posted 1 month ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles and Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate data pipelines via the Airflow scheduler

Skills and Qualifications
- Bachelor's and/or master's degree in computer science, or equivalent experience
- 6+ years of total IT experience, with 3+ years in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Experience with the AWS/Azure stack is a must
- ETL with batch and streaming (Kinesis) is desirable
- Experience building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python / PySpark / Spark with Azure/AWS Databricks
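The qualifications above lean on dimensional modelling, which in practice often means maintaining slowly changing dimensions. As a rough, self-contained sketch, here is a Type 2 slowly-changing-dimension update in plain Python; the keys, fields, and dates are invented, and in a Databricks setting this would typically be a `MERGE` against a Delta Lake table.

```python
# Type 2 SCD: instead of overwriting a changed dimension row, close out the
# current version and insert a new one, preserving full history. All data
# and field names here are illustrative.

def scd2_upsert(dim, key, attrs, effective_date):
    """Apply a Type 2 update to a dimension table held as a list of dicts."""
    current = next((r for r in dim if r["key"] == key and r["is_current"]), None)
    if current and all(current[k] == v for k, v in attrs.items()):
        return  # no attribute change: nothing to do
    if current:
        # Close out the old version.
        current["is_current"] = False
        current["end_date"] = effective_date
    # Insert the new current version.
    dim.append({"key": key, **attrs,
                "start_date": effective_date, "end_date": None,
                "is_current": True})

dim_customer = []
scd2_upsert(dim_customer, "C1", {"city": "Pune"},   "2024-01-01")
scd2_upsert(dim_customer, "C1", {"city": "Mumbai"}, "2024-06-01")  # city changed

history = [(r["city"], r["is_current"]) for r in dim_customer]
print(history)  # [('Pune', False), ('Mumbai', True)]
```

Keeping the closed-out row is what lets fact tables join to the dimension version that was current when each fact occurred.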
Posted 1 month ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles and Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate data pipelines via the Airflow scheduler

Skills and Qualifications
- Bachelor's and/or master's degree in computer science, or equivalent experience
- 6+ years of total IT experience, with 3+ years in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Experience with the AWS/Azure stack is a must
- ETL with batch and streaming (Kinesis) is desirable
- Experience building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python / PySpark / Spark with Azure/AWS Databricks
Posted 1 month ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles and Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate data pipelines via the Airflow scheduler

Skills and Qualifications
- Bachelor's and/or master's degree in computer science, or equivalent experience
- 6+ years of total IT experience, with 3+ years in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Experience with the AWS/Azure stack is a must
- ETL with batch and streaming (Kinesis) is desirable
- Experience building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python / PySpark / Spark with Azure/AWS Databricks
Posted 1 month ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles and Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate data pipelines via the Airflow scheduler

Skills and Qualifications
- Bachelor's and/or master's degree in computer science, or equivalent experience
- 6+ years of total IT experience, with 3+ years in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Experience with the AWS/Azure stack is a must
- ETL with batch and streaming (Kinesis) is desirable
- Experience building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python / PySpark / Spark with Azure/AWS Databricks
Posted 1 month ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Develop strategic design and requirements on small systems or modules of large systems (large scale). Perform general application development activities, including unit testing, code deployment to a development environment, and technical documentation.

What you'll do
- Solve unique and complex problems with a broad impact on the business
- Build large, complex projects to achieve key business objectives
- Translate highly complex concepts in ways that can be understood by a variety of audiences
- Deploy and maintain applications/systems
- Work with product owners, UX, and other business partners to define work for the team
- Facilitate code reviews, code quality checks, testing, automation, etc.
- Ensure integrated end-to-end design

What experience you need
- 5+ years as a full-stack developer with experience in client-side JavaScript frameworks (preferably Angular, TypeScript), Node.js, NPM, and server-side frameworks (Java / Spring / Spring Cloud / Hibernate / Spring Boot microservices)
- 3+ years of experience working with cloud environments (GCP and AWS)
- 3+ years of experience with microservices / REST services / SOAP
- 2+ years of experience with Postgres / Oracle / MySQL / NoSQL databases (MongoDB, Cassandra, Neo4j)
- 2+ years of experience with Node.js, React, Backbone, or other client-side MVC technologies
- 2+ years of experience with unit and automation testing (Jasmine, Protractor, JUnit)
- Experience with continuous integration build tools (Jenkins, SonarQube, JIRA, Nexus, Confluence, Git/Bitbucket, Maven, Gradle, Rundeck)
- REST API design and implementation
- Knowledge of Java build tools and dependency management (Gradle, Maven)
- English proficiency B2 or above

What could set you apart
- SRE experience
- Strong interpersonal skills, as well as a strong teamwork and customer support focus
- Aggressive problem diagnosis and creative problem-solving skills on highly complex problems; technical agility
- Experience with API frameworks (Apigee)
- Experience working with Agile methodologies
- Experience in UNIX or Linux (a plus)
- Experience with CSS preprocessors (Less, Sass)
- Familiarity with secure development best practices
- Experience creating responsive designs (Bootstrap, mobile, etc.)
- Knowledge of security principles (encryption, authentication/authorization, etc.)

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?

At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real, and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference, and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Posted 1 month ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Develop strategic design and requirements on small systems or modules of large systems (large scale). Perform general application development activities, including unit testing, code deployment to a development environment, and technical documentation.

What you'll do
- Solve unique and complex problems with a broad impact on the business
- Build large, complex projects to achieve key business objectives
- Translate highly complex concepts in ways that can be understood by a variety of audiences
- Deploy and maintain applications/systems
- Work with product owners, UX, and other business partners to define work for the team
- Facilitate code reviews, code quality checks, testing, automation, etc.
- Ensure integrated end-to-end design

What experience you need
- 5+ years as a full-stack developer with experience in client-side JavaScript frameworks (preferably Angular, TypeScript), Node.js, NPM, and server-side frameworks (Java / Spring / Spring Cloud / Hibernate / Spring Boot microservices)
- 3+ years of experience working with cloud environments (GCP and AWS)
- 3+ years of experience with microservices / REST services / SOAP
- 2+ years of experience with Postgres / Oracle / MySQL / NoSQL databases (MongoDB, Cassandra, Neo4j)
- 2+ years of experience with Node.js, React, Backbone, or other client-side MVC technologies
- 2+ years of experience with unit and automation testing (Jasmine, Protractor, JUnit)
- Experience with continuous integration build tools (Jenkins, SonarQube, JIRA, Nexus, Confluence, Git/Bitbucket, Maven, Gradle, Rundeck)
- REST API design and implementation
- Knowledge of Java build tools and dependency management (Gradle, Maven)
- English proficiency B2 or above

What could set you apart
- SRE experience
- Strong interpersonal skills, as well as a strong teamwork and customer support focus
- Aggressive problem diagnosis and creative problem-solving skills on highly complex problems; technical agility
- Experience with API frameworks (Apigee)
- Experience working with Agile methodologies
- Experience in UNIX or Linux (a plus)
- Experience with CSS preprocessors (Less, Sass)
- Familiarity with secure development best practices
- Experience creating responsive designs (Bootstrap, mobile, etc.)
- Knowledge of security principles (encryption, authentication/authorization, etc.)
Posted 1 month ago
8.0 years
0 Lacs
Greater Kolkata Area
On-site
Role: Technical Architect
Experience: 8-15 years
Location: Bangalore, Chennai, Gurgaon, Pune, and Kolkata
Mandatory Skills: Python, PySpark, SQL, ETL, pipelines, Azure Databricks, Azure Data Factory, and architecture design

Primary Roles and Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate data pipelines via the Airflow scheduler

Skills and Qualifications
- Bachelor's and/or master's degree in computer science, or equivalent experience
- 8+ years of total IT experience, with 5+ years in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Experience with the AWS/Azure stack is a must
- ETL with batch and streaming (Kinesis) is desirable
- Experience building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
Posted 1 month ago
5.0 - 10.0 years
20 - 35 Lacs
Pune, Gurugram
Work from Office
In one sentence

We are seeking a highly skilled and adaptable Senior Python Developer to join our fast-paced and dynamic team. The ideal candidate is a hands-on technologist with deep expertise in Python and a strong background in data engineering, cloud platforms, and modern development practices. You will play a key role in building scalable, high-performance applications and data pipelines that power critical business functions. You will be instrumental in designing and developing high-performance data pipelines from relational to graph databases, and in leveraging Agentic AI for orchestration. You'll also define APIs using AWS Lambda and containerised services on AWS ECS. Join us on an exciting journey where you'll work with cutting-edge technologies, including Generative AI, Agentic AI, and modern cloud-native architectures, while continuously learning and growing alongside a passionate team.

What will your job look like?

Key Attributes: Adaptability & Agility
- Thrive in a fast-paced, ever-evolving environment with shifting priorities
- Demonstrated ability to quickly learn and integrate new technologies and frameworks
- Strong problem-solving mindset with the ability to juggle multiple priorities effectively

Core Responsibilities
- Design, develop, test, and maintain robust Python applications and data pipelines using Python/PySpark
- Define and implement smart data pipelines from RDBMS to graph databases
- Build and expose APIs using AWS Lambda and ECS-based microservices
- Collaborate with cross-functional teams to define, design, and deliver new features
- Write clean, efficient, and scalable code following best practices
- Troubleshoot, debug, and optimise applications for performance and reliability
- Contribute to the setup and maintenance of CI/CD pipelines and deployment workflows as required
- Ensure security, compliance, and observability across all development activities

All you need is...
Required Skills & Experience Expert-level proficiency in Python with a strong grasp of object-oriented and functional programming. Solid experience with SQL and graph databases (e.g., Neo4j, Amazon Neptune). Hands-on experience with cloud platforms (AWS and/or Azure) is a must. Proficiency in PySpark or similar data ingestion and processing frameworks. Familiarity with DevOps tools such as Docker, Kubernetes, Jenkins, and Git. Strong understanding of CI/CD, version control, and agile development practices. Excellent communication and collaboration skills. Desirable Skills Experience with Agentic AI, machine learning, or LLM-based systems. Familiarity with Apache Iceberg or similar modern data lakehouse formats. Knowledge of Infrastructure as Code (IaC) tools like Terraform or Ansible. Understanding of microservices architecture and distributed systems. Exposure to observability tools (e.g., Prometheus, Grafana, ELK stack). Experience working in Agile/Scrum environments. Minimum Qualifications 6 to 8 years of hands-on experience in Python development and data engineering. Demonstrated success in delivering production-grade software and scalable data solutions.
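The RDBMS-to-graph pipeline responsibility above can be sketched in a few lines of Python. This is a minimal illustration only: sqlite3 stands in for the source RDBMS, a plain dict stands in for a graph store such as Neo4j or Amazon Neptune, and every table, column, and label name here is invented.

```python
# Hedged sketch of an RDBMS -> property-graph pipeline. sqlite3 plays the
# relational source; the dict plays the graph store. All names are illustrative.
import sqlite3

def extract_rows(conn):
    """Pull employee -> department rows from the relational source."""
    return conn.execute(
        "SELECT e.id, e.name, d.name FROM employees e "
        "JOIN departments d ON e.dept_id = d.id"
    ).fetchall()

def to_graph(rows):
    """Transform rows into nodes plus WORKS_IN edges (a Cypher MERGE analogue)."""
    graph = {"nodes": {}, "edges": []}
    for emp_id, emp_name, dept_name in rows:
        graph["nodes"][f"emp:{emp_id}"] = {"label": "Employee", "name": emp_name}
        graph["nodes"].setdefault(f"dept:{dept_name}",
                                  {"label": "Department", "name": dept_name})
        graph["edges"].append((f"emp:{emp_id}", "WORKS_IN", f"dept:{dept_name}"))
    return graph

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE departments (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept_id INTEGER);
    INSERT INTO departments VALUES (1, 'Data'), (2, 'Platform');
    INSERT INTO employees VALUES (10, 'Asha', 1), (11, 'Ravi', 2);
""")
graph = to_graph(extract_rows(conn))
```

A production pipeline would replace the dict with driver calls (for example Cypher MERGE statements over the Neo4j Bolt driver), but the extract/transform split stays the same.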
Posted 1 month ago
3.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Title: Team Leader – Full Stack & GenAI Projects Location: Mumbai, Work From Office Reporting To: Project Manager Experience: 2–3 years Employment Type: Full-time Job Summary We are looking for a motivated and responsible Team Leader to manage the delivery of full stack development projects with a focus on Generative AI applications. You will lead a team of 3–5 developers, ensure high-quality deliverables, and collaborate closely with the project manager to meet deadlines and client expectations. Key Responsibilities Lead the design, development, and deployment of web-based software solutions using modern full stack technologies Guide and mentor a team of 3–5 developers; assign tasks and monitor progress Take ownership of project deliverables and ensure timely, quality outcomes Collaborate with cross-functional teams including UI/UX, DevOps, and QA Apply problem-solving skills to address technical challenges and design scalable solutions Contribute to the development of GenAI-based modules and features Ensure adherence to coding standards, version control, and agile practices Required Skills & Qualifications Bachelor’s degree in Computer Science, Information Technology, or related field 2–3 years of experience in full stack development (front-end + back-end) Proficiency in one or more tech stacks (e.g., React/Angular + Node.js/Java/Python) Solid understanding of databases, REST APIs, and version control (Git) Strong problem-solving skills and ability to work independently Excellent programming, debugging, and team collaboration skills Exposure to Generative AI frameworks or APIs is a strong plus Willingness to work from office full-time Nice to Have Experience in leading or mentoring small teams Familiarity with cloud platforms (AWS, GCP, or Azure) Knowledge of CI/CD practices and Agile methodologies About Us Cere Labs is a Mumbai-based company working in the field of Artificial Intelligence.
It is a product company that utilizes the latest technologies such as Python, Redis, Neo4j, MVC, Docker, Kubernetes to build its AI platform. Cere Labs’ clients are primarily from the Banking and Finance domain in India and US. The company has a great environment for its employees to learn and grow in technology. Skills: Python, React.js and Spring Boot
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
As the global leader in high-speed connectivity, Ciena is committed to a people-first approach. Our teams enjoy a culture focused on prioritizing a flexible work environment that empowers individual growth, well-being, and belonging. We’re a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact. We believe in the power of people. We are a network strategy and technology company that is motivated by making a difference in people’s lives – their productivity, creativity, health and comfort. We’re looking for a highly motivated, talented and experienced engineer who is passionate about product verification automation activities and is ready to assume a leadership position within the team in addressing future projects. You will certify solutions that provide our customers opportunities to differentiate their service offerings in a very competitive market. The ideal candidate is a flexible, highly technical problem solver, with interdisciplinary knowledge of software, testing, and test automation. You feel at home in a dynamic, multi-disciplined engineering environment, acting as an interface between product design, other Blue Planet test engineering teams, and members of other functional groups (support, documentation, marketing, etc.). RESPONSIBILITIES Engage with various engineering teams, product line managers and product owners to transform concepts and high-level requirements into optimized test coverage and enhanced customer experience. Automate and maintain all manually devised and executed test cases using automation best practices and maintain the CI/CD pipeline framework Coding E2E Automated tests for the Angular UI frontend with Cucumber/Webdriver.io.
Coding REST API testing automation Coding system testing with Ansible and Bash scripting Drive (plan and implement) lab or simulation environment setup activities to fully address proposed testing scenarios and coordinate equipment acquisition/sharing agreements with the various teams concerned. Analyse test results and prepare test reports. Investigate software defects and highlight critical issues that can have potential customer impact and consult with software development engineers in finding resolution or to address problems related to specifications and/or test plans/procedures. Raise Agile Jira bugs for product defects Report on automation status Research the best tools/ways of test automation for required functionality Skills Expected from the candidate: Frontend testing frameworks/libraries: Cucumber/Webdriver.io Backend programming/markup languages: Python Backend testing: REST API testing automation tools, Postman/Newman, Jasmine Load testing: JMeter, Grafana + Prometheus Container management: Docker, Kubernetes, OpenStack Testing Theory: terminology, testing types, asynchronous automated testing Continuous Integration Tools: Jenkins, TeamCity, GitLab Cloud Environments: AWS, Azure, Google Cloud Version control system: Git, Bitbucket System Testing Automation with: Bash, Shell, Python, Ansible scripting Hands-on experience of CI/CD pipeline configuration and maintenance Solid operational and administrator experience with Unix operating systems Understanding of Web application and Microservice solution architecture Strong abilities to rapidly learn new complex technological concepts and apply knowledge in daily activities. Excellent written (documentation) and interpersonal communication skills (English). Strong abilities to work as part of a team or independently with little supervision.
Experienced working as part of an Agile scrum team and with DevOps processes Desirable For The Candidate Ticketing: Jira Documentation: Confluence, Gitlab Frontend programming/markup languages: Typescript/JavaScript, HTML, CSS, SVG Frontend development frameworks/libraries: Angular 2+, Node.js/npm, D3.js, gulp Programming theory: algorithms and data structures, relational and graph database concepts, etc. Non-critical Extras Domain: Telecom, Computer Networking, OSS Builds: Maven, NPM, JVM, NodeJS Databases: PostgreSQL, Neo4j, ClickHouse Test Management: TestRail Other Skills: ElasticSearch, Drools, Kafka integration, REST (on Spring MVC), SSO (LDAP, Reverse Proxy, OAuth2) Not ready to apply? Join our Talent Community to get relevant job alerts straight to your inbox. At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.
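The REST API test-automation duty described in this listing can be illustrated with only the Python standard library: a stub HTTP endpoint plus an assertion-based check. The `/api/health` endpoint and its payload are invented for the sketch; a real suite here would exercise the product's own APIs with Postman/Newman or a pytest harness.

```python
# Minimal REST API test-automation sketch using only the stdlib.
# The stub server and endpoint are illustrative, not a real product API.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/health":
            body = json.dumps({"status": "up"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

def check_health(base_url):
    """Assert the health endpoint returns HTTP 200 with status 'up'."""
    with urllib.request.urlopen(base_url + "/api/health") as resp:
        assert resp.status == 200
        payload = json.loads(resp.read())
    assert payload["status"] == "up"
    return payload

server = HTTPServer(("127.0.0.1", 0), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
result = check_health(f"http://127.0.0.1:{server.server_address[1]}")
server.shutdown()
```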
Posted 1 month ago
4.0 - 6.0 years
7 - 10 Lacs
Hyderabad
Work from Office
What you will do In this vital role you will be part of Research's Semantic Graph Team, which is seeking a dedicated and skilled Semantic Data Engineer to build and optimize knowledge graph-based software and data resources. This role primarily focuses on working with technologies such as RDF, SPARQL, and Python. In addition, the position involves semantic data integration and cloud-based data engineering. The ideal candidate should possess experience in the pharmaceutical or biotech industry, deep technical skills, proficiency with big data technologies, and demonstrated experience in semantic modeling. A deep understanding of data architecture and ETL processes is also essential for this role. In this role, you will be responsible for constructing semantic data pipelines, integrating both relational and graph-based data sources, ensuring seamless data interoperability, and leveraging cloud platforms to scale data solutions effectively. Roles & Responsibilities: Develop and maintain semantic data pipelines using Python, RDF, SPARQL, and linked data technologies. Develop and maintain semantic data models for biopharma scientific data Integrate relational databases (SQL, PostgreSQL, MySQL, Oracle, etc.) with semantic frameworks. Ensure interoperability across federated data sources, linking relational and graph-based data. Implement and optimize CI/CD pipelines using GitLab and AWS. Leverage cloud services (AWS Lambda, S3, Databricks, etc.) to support scalable knowledge graph solutions. Collaborate with global multi-functional teams, including research scientists, Data Architects, Business SMEs, Software Engineers, and Data Scientists to understand data requirements, design solutions, and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions. Collaborate with data scientists, engineers, and domain experts to improve research data accessibility.
Adhere to standard processes for coding, testing, and designing reusable code/components. Explore new tools and technologies to improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation. Maintain comprehensive documentation of processes, systems, and solutions. Harmonize research data to appropriate taxonomies, ontologies, and controlled vocabularies for context and reference knowledge. Basic Qualifications and Experience: Doctorate degree OR Master's degree with 4 - 6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or related field OR Bachelor's degree with 6 - 8 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or related field OR Diploma with 10 - 12 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or related field Preferred Qualifications and Experience: 6+ years of experience in designing and supporting biopharma scientific research data analytics (software platforms) Functional Skills: Must-Have Skills: Advanced Semantic and Relational Data Skills: Proficiency in Python, RDF, SPARQL, Graph Databases (e.g. AllegroGraph), SQL, relational databases, ETL pipelines, big data technologies (e.g. Databricks), semantic data standards (OWL, W3C, FAIR principles), ontology development and semantic modeling practices. Cloud and Automation Expertise: Good experience in using cloud platforms (preferably AWS) for data engineering, along with Python for automation, data federation techniques, and model-driven architecture for scalable solutions. Technical Problem-Solving: Excellent problem-solving skills with hands-on experience in test automation frameworks (pytest), scripting tasks, and handling large, complex datasets.
Good-to-Have Skills: Experience in biotech/drug discovery data engineering Experience applying knowledge graphs, taxonomy and ontology concepts in life sciences and chemistry domains Experience with graph databases (AllegroGraph, Neo4j, GraphDB, Amazon Neptune) Familiarity with Cypher, GraphQL, or other graph query languages Experience with big data tools (e.g. Databricks) Experience in biomedical or life sciences research data management Soft Skills: Excellent critical-thinking and problem-solving skills Good communication and collaboration skills Demonstrated awareness of how to function in a team setting Demonstrated presentation skills
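The SPARQL skills this listing asks for can be illustrated with a pure-Python sketch of basic graph pattern matching over RDF-style triples. In practice this work would use rdflib or a store such as AllegroGraph; the triples and prefixed names below are invented for the illustration.

```python
# Pure-Python sketch of SPARQL-style basic graph pattern matching.
# Variables start with '?'; the tiny dataset is invented.
TRIPLES = {
    ("ex:aspirin", "rdf:type", "ex:Drug"),
    ("ex:aspirin", "ex:targets", "ex:COX1"),
    ("ex:ibuprofen", "rdf:type", "ex:Drug"),
    ("ex:ibuprofen", "ex:targets", "ex:COX2"),
    ("ex:COX1", "rdf:type", "ex:Protein"),
}

def match(pattern, binding=None):
    """Yield variable bindings for a single triple pattern."""
    binding = binding or {}
    for triple in TRIPLES:
        new = dict(binding)
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                if new.get(term, value) != value:
                    break  # conflicts with an earlier binding
                new[term] = value
            elif term != value:
                break  # constant term does not match
        else:
            yield new

def query(patterns):
    """Join several triple patterns, like a SPARQL WHERE clause."""
    bindings = [{}]
    for pattern in patterns:
        bindings = [b2 for b in bindings for b2 in match(pattern, b)]
    return bindings

# SELECT ?drug ?t WHERE { ?drug rdf:type ex:Drug . ?drug ex:targets ?t }
results = query([("?drug", "rdf:type", "ex:Drug"),
                 ("?drug", "ex:targets", "?t")])
```

The join semantics (each pattern narrows the binding set) is the same idea a real SPARQL engine optimizes at scale.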
Posted 1 month ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Overview: The Technology Solution Delivery - Front Line Manager (M1) is responsible for providing leadership and day-to-day direction to a cross functional engineering team. This role involves establishing and executing operational plans, managing relationships with internal and external customers, and overseeing technical fulfillment projects. The manager also supports sales verticals in customer interactions and ensures the delivery of technology solutions aligns with business needs. What you will do: Build strong relationships with both internal and external stakeholders including product, business and sales partners. Demonstrate excellent communication skills with the ability to both simplify complex problems and also dive deeper if needed Manage teams with cross functional skills that include software, quality, reliability engineers, project managers and scrum masters. Mentor, coach and develop junior and senior software, quality and reliability engineers. Collaborate with the architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices Ensure compliance with EFX secure software development guidelines and best practices and responsible for meeting and maintaining QE, DevSec, and FinOps KPIs. Define, maintain and report SLA, SLO, SLIs meeting EFX engineering standards in partnership with the product, engineering and architecture teams Drive technical documentation including support, end user documentation and run books. 
Lead Sprint planning, Sprint Retrospectives, and other team activities Implement architecture decision making associated with Product features/stories, refactoring work, and EOSL decisions Create and deliver technical presentations to internal and external technical and non-technical stakeholders communicating with clarity and precision, and present complex information in a concise format that is audience appropriate Provides coaching, leadership and talent development; ensures the team functions as a high-performing team; able to identify performance gaps and opportunities for upskilling and transition when necessary. Drives culture of accountability through actions and stakeholder engagement and expectation management Develop the long-term technical vision and roadmap within, and often beyond, the scope of your teams. Oversee systems designs within the scope of the broader area, and review product or system development code to solve ambiguous problems Identify and resolve problems affecting day-to-day operations Set priorities for the engineering team and coordinate work activities with other supervisors Cloud Certification Strongly Preferred What experience you need: BS or MS degree in a STEM major or equivalent job experience required 10+ years’ experience in software development and delivery You adore working in a fast-paced and agile development environment You possess excellent communication, sharp analytical abilities, and proven design skills You have detailed knowledge of modern software development lifecycles including CI/CD You have the ability to operate across a broad and complex business unit with multiple stakeholders You have an understanding of the key aspects of finance especially as related to Technology.
Specifically including total cost of ownership and value You are a self-starter, highly motivated, and have a real passion for actively learning and researching new methods of work and new technology You possess excellent written and verbal communication skills with the ability to communicate with team members at various levels, including business leaders What Could Set You Apart UI development (e.g. HTML, JavaScript, AngularJS, Angular4/5 and Bootstrap) Source code control management systems (e.g. SVN/Git, Subversion) and build tools like Maven Big Data, Postgres, Oracle, MySQL, NoSQL databases (e.g. Cassandra, Hadoop, MongoDB, Neo4J) Design patterns Agile environments (e.g. Scrum, XP) Software development best practices such as TDD (e.g. JUnit), automated testing (e.g. Gauge, Cucumber, FitNesse), continuous integration (e.g. Jenkins, GoCD) Linux command line and shell scripting languages Relational databases (e.g. SQL Server, MySQL) Cloud computing, SaaS (Software as a Service) Atlassian tooling (e.g. JIRA, Confluence, and Bitbucket) Experience working in financial services Experience working with open source frameworks; preferably Spring, though we would also consider Ruby, Apache Struts, Symfony, Django, etc. Automated Testing: JUnit, Selenium, LoadRunner, SoapUI Behaviors: Customer-focused with a drive to exceed expectations. Demonstrates integrity and accountability. Intellectually curious and driven to innovate. Values diversity and fosters collaboration. Results-oriented with a sense of urgency and agility.
Posted 1 month ago
3.0 - 8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Company: Indian / Global Engineering & Manufacturing Organization Key Skills: ETL/ELT, RDF, OWL, SPARQL, Neo4j, AWS Neptune, ArangoDB, Python, SQL, Cypher, Semantic Modeling, Cloud Data Pipelines, Data Quality, Knowledge Graph, Graph Query Optimization, Semantic Search. Roles and Responsibilities: Design and build advanced data pipelines for integrating structured and unstructured data into graph models. Develop and maintain semantic models using RDF, OWL, and SPARQL. Implement and optimize data pipelines on cloud platforms such as AWS, Azure, or GCP. Model real-world relationships through ontologies and hierarchical graph data structures. Work with graph databases such as Neo4j, AWS Neptune, ArangoDB for knowledge graph development. Collaborate with cross-functional teams including AI/ML and business analysts to support semantic search and analytics. Ensure data quality, security, and compliance throughout the pipeline lifecycle. Monitor, debug, and enhance performance of graph queries and data transformation workflows. Create clear documentation and communicate technical concepts to non-technical stakeholders. Participate in global team meetings and knowledge-sharing sessions to align on data standards and architectural practices. Experience Requirement: 3-8 years of hands-on experience in ETL/ELT engineering and data integration. Experience working with graph databases such as Neo4j, AWS Neptune, or ArangoDB. Proven experience implementing knowledge graphs, including semantic modeling using RDF, OWL, and SPARQL. Strong Python and SQL programming skills, with proficiency in Cypher or other graph query languages. Experience designing and deploying pipelines on cloud platforms (AWS preferred). Track record of resolving complex data quality issues and optimizing pipeline performance. Previous collaboration with data scientists and product teams to implement graph-based analytics or semantic search features. Education: Any Graduation.
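One small, concrete facet of the ontology and hierarchy modelling this listing describes is the transitive closure over rdfs:subClassOf links, the graph analogue of a SPARQL property path such as `rdfs:subClassOf+`. A dependency-free sketch with invented class names:

```python
# Ontology-style hierarchy reasoning: walk rdfs:subClassOf links upward.
# Class names are invented for illustration.
SUBCLASS_OF = {
    "ex:ElectricMotor": "ex:Motor",
    "ex:Motor": "ex:Component",
    "ex:Component": "ex:Asset",
}

def ancestors(cls):
    """Return all superclasses of cls, nearest first (transitive closure)."""
    seen = []
    while cls in SUBCLASS_OF:
        cls = SUBCLASS_OF[cls]
        seen.append(cls)
    return seen
```

A graph database would answer the same question with a path query (e.g. a Cypher variable-length pattern) instead of this explicit walk.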
Posted 1 month ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Description Hiring Location: Mumbai/Chennai/Gurgaon Job Summary We are seeking a Lead I in Software Engineering with 4 to 7 years of experience in software development or software architecture. The ideal candidate will possess a strong background in Angular and Java, with the ability to lead a team and drive technical projects. A Bachelor's degree in Engineering or Computer Science, or equivalent experience, is required. Responsibilities Interact with technical personnel and team members to finalize requirements. Write and review detailed specifications for the development of system components of moderate complexity. Collaborate with QA and development team members to translate product requirements into software designs. Implement development processes, coding best practices, and conduct code reviews. Operate in various development environments (Agile, Waterfall) while collaborating with key stakeholders. Resolve technical issues as necessary. Perform all other duties as assigned. Must-Have Skills Strong proficiency in Angular 1.X (70% Angular and 30% Java OR 50% Angular and 50% Java). Java/J2EE; Familiarity with Singleton and MVC design patterns. Strong proficiency in SQL and/or MySQL, including optimization techniques (at least MySQL). Experience using tools such as Eclipse, GIT, Postman, JIRA, and Confluence. Knowledge of test-driven development. Solid understanding of object-oriented programming. Good-to-Have Skills Expertise in Spring Boot, Microservices, and API development. Familiarity with OAuth2.0 patterns (experience with at least 2 patterns). Knowledge of Graph Databases (e.g., Neo4J, Apache Tinkerpop, Gremlin). Experience with Kafka messaging. Familiarity with Docker, Kubernetes, and cloud development. Experience with CI/CD tools like Jenkins and GitHub Actions. Knowledge of industry-wide technology trends and best practices. Experience Range 4 to 7 years of relevant experience in software development or software architecture. 
Education Bachelor’s degree in Engineering, Computer Science, or equivalent experience. Additional Information Strong communication skills, both oral and written. Ability to interface competently with internal and external technology resources. Advanced knowledge of software development methodologies (Agile, etc.). Experience in setting up and maintaining distributed applications in Unix/Linux environments. Ability to complete complex bug fixes and support production issues. Skills: Angular 1.X, Java 11+, SQL The expectation is 60-70% in Angular primarily and 30-40% in Java.
Posted 1 month ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
TCS Hiring !!! Role: Python Full Stack Developer Experience: 6-8 years Location: Hyderabad & Kolkata Job Description: 6+ years of overall experience with proficiency in React (2+ years), TypeScript (1+ year), React hooks (1+ year) Experience with ESLint, CSS-in-JS styling (preferably Emotion), state management (preferably Redux), and JavaScript bundlers such as Webpack Experience with integrating with RESTful APIs or other web services Expertise with Python (3+ years, preferably Python 3) Proficiency with a Python web framework (2+ years, preferably Flask and FastAPI) Experience with a Python linter (preferably flake8), graph databases (preferably Neo4j), a package manager (preferably pip), Elasticsearch, and Airflow Experience with developing microservices, RESTful APIs or other web services Experience with Database design and management, including NoSQL/RDBMS tradeoffs
Posted 1 month ago
2.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Technical Expertise: (minimum 2 years' relevant experience) ● Solid understanding of Generative AI models and Natural Language Processing (NLP) techniques, including Retrieval-Augmented Generation (RAG) systems, text generation, and embedding models. ● Exposure to Agentic AI concepts, multi-agent systems, and agent development using open-source frameworks like LangGraph and LangChain. ● Hands-on experience with modality-specific encoder models (text, image, audio) for multi-modal AI applications. ● Proficient in model fine-tuning and prompt engineering, using both open-source and proprietary LLMs. ● Experience with model quantization, optimization, and conversion techniques (FP32 to INT8, ONNX, TorchScript) for efficient deployment, including edge devices. ● Deep understanding of inference pipelines, batch processing, and real-time AI deployment on both CPU and GPU. ● Strong MLOps knowledge with experience in version control, reproducible pipelines, continuous training, and model monitoring using tools like MLflow, DVC, and Kubeflow. ● Practical experience with scikit-learn, TensorFlow, and PyTorch for experimentation and production-ready AI solutions. ● Familiarity with data preprocessing, standardization, and knowledge graphs (nice to have). ● Strong analytical mindset with a passion for building robust, scalable AI solutions. ● Skilled in Python, writing clean, modular, and efficient code. ● Proficient in RESTful API development using Flask, FastAPI, etc., with integrated AI/ML inference logic. ● Experience with MySQL, MongoDB, and vector databases like FAISS, Pinecone, or Weaviate for semantic search. ● Exposure to Neo4j and graph databases for relationship-driven insights. ● Hands-on with Docker and containerization to build scalable, reproducible, and portable AI services. ● Up-to-date with the latest in GenAI, LLMs, Agentic AI, and deployment strategies.
● Strong communication and collaboration skills, able to contribute in cross-functional and fast-paced environments. Bonus Skills ● Experience with cloud deployments on AWS, GCP, or Azure, including model deployment and model inferencing. ● Working knowledge of Computer Vision and real-time analytics using OpenCV, YOLO, and similar tools.
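The FP32-to-INT8 quantization this listing mentions boils down to affine scale/zero-point arithmetic. The sketch below is dependency-free and illustrative only; real deployments would use PyTorch, ONNX Runtime, or TensorRT tooling, and the weight values are made up.

```python
# Affine FP32 -> INT8 quantization sketch (no PyTorch/ONNX): map floats onto
# an unsigned 8-bit range with a scale and zero-point, then recover them.

def quantize(values, num_bits=8):
    """Return (quantized ints, scale, zero_point) for a list of floats."""
    qmax = 2 ** num_bits - 1
    lo, hi = min(values), max(values)
    scale = (hi - lo) / qmax or 1.0  # avoid zero scale for constant inputs
    zero_point = round(-lo / scale)
    q = [max(0, min(qmax, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Map quantized ints back to approximate floats."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.2, -0.3, 0.0, 0.5, 1.7]
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
max_err = max(abs(w - r) for w, r in zip(weights, restored))
```

The round-trip error stays below one quantization step (the scale), which is the trade-off that makes INT8 storage and arithmetic cheap enough for edge deployment.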
Posted 1 month ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description: The next evolution of AI-powered cyber defense is here. With the rise of cloud and modern technologies, organizations struggle with the vast amount of data, and the resulting flood of security alerts, generated by their existing security tools. Cyberattacks continue to get more sophisticated and harder to detect in the sea of alerts and false positives. According to the Forrester 2023 Enterprise Breach Benchmark Report, a security breach costs organizations an average of $3M and takes organizations over 200 days to investigate and respond. AiStrike’s platform aims at reducing the time to investigate and respond to threats by over 90%. Our approach is to leverage the power of AI and machine learning to adopt an attacker mindset to prioritize and automate cyberthreat investigation and response. The platform reduces alerts by 100:5 and provides detailed context and link analysis capabilities to investigate the alert. The platform also provides collaborative workflow and no-code automation to cut down the time to respond to threats significantly. If you have the desire to join the next evolution of cyber defense, are willing to work hard and learn fast, and be part of building something special, this is the company for you. We are seeking a highly skilled and experienced hands-on Principal Software Engineer with 10+ years of proven expertise in the field. As a Principal Architect, you will play a crucial role in leading the architecture, designing, and implementing scalable cloud solutions for our Cloud-native SaaS products. The ideal candidate will have significant experience and a strong background in object-oriented design and coding skills, with hands-on experience in Java and Python.
Roles and Responsibilities: Manage overarching product/platform architecture, and technology selection and make sure that the design and development of all projects follow the architectural vision Design and architect scalable cloud solutions for Cloud-native SaaS development projects in line with the latest technology and practices Successfully communicate, evangelize, and implement the architectural vision across teams and products Design and coordinate projects of significant size and complexity Work with containerization technologies and orchestration software such as Kubernetes on cloud platforms like AWS and Azure. Develop and implement Microservices-based architecture using Java, SpringBoot, ReactJS, NextJS, and other relevant technologies. Implement secure design principles and practices, ensuring the integrity and confidentiality of our systems. Collaborate with cross-geography cross-functional teams to define and refine requirements and specifications. Deploy workloads at scale in AWS EKS/ECS environments and others as needed Create automation and use monitoring tools to efficiently build, deploy and support cloud implementations. Implement DevOps methodologies and tools for continuous integration and delivery. Utilize APM and Monitoring Tools like ELK, Splunk, Datadog, Dynatrace, and Appdynamics for cloud-scale monitoring. Work with potential customers to understand their environment. Provide technical leadership, architecture guidance, and mentorship to the teams. Have a clear focus on scale, cost, security, and maintainability. Stay updated on industry best practices, emerging technologies, and cybersecurity trends. Skills and Qualifications: 10+ years of overall experience in software development and architecture. In depth knowledge and experience in Cloud-native SaaS development and architecture. Proficient in Java, Python, RESTful APIs, API Gateway, Kafka, and Microservices communications. 
Experience with RDBMS and NoSQL databases (e.g., Neo4J, MongoDB, Redis). Experience in working with Graph databases like Neo4J. Expertise in containerization technologies (Docker) and Kubernetes. Hands-on experience with secure DevOps practices. Familiarity with Multi-Factor Authentication and Single Sign-On principles. Excellent verbal and written communication skills. Self-starter with strong organizational and problem-solving skills. Prior experience in deploying workloads at scale in AWS EKS/ECS/Fargate. Knowledge of Cloud-scale APM and Monitoring Tools (ELK, Splunk, Datadog, etc.). Previous experience in Cybersecurity products is desirable but not mandatory. Preferred: AWS Certified Solutions Architect – Professional or similar certification, including certifications on other cloud platforms. Commitment, team player, integrity and customer focus AiStrike is committed to providing equal employment opportunities. All qualified applicants and employees will be considered for employment and advancement without regard to race, color, religion, creed, national origin, ancestry, sex, gender, gender identity, gender expression, physical or mental disability, age, genetic information, sexual or affectional orientation, marital status, status regarding public assistance, familial status, military or veteran status or any other status protected by applicable law.
Posted 1 month ago
10.0 years
0 Lacs
Thiruporur, Tamil Nadu, India
On-site
Job Description Join us as a Domain Architect in the Autonomous Network domain and be a part of our success journey! In this role, you’ll have an opportunity to shape innovative solutions and make a real impact. As an advisor, you’ll work closely with stakeholders to address their unique needs and translate them into practical, high-value solutions. With a focus on industry best practices and architectural excellence, you'll help customers achieve their business goals with confidence, while growing your expertise in a dynamic, forward-thinking environment. Join us and be a part of something extraordinary! How You Will Contribute And What You Will Learn Develop a Requirement Definition Document (RDD), High-Level Design (HLD), and Low-Level Design (LLD). Stay updated on customer architecture within the dedicated technical area and regional requirements. Apply solution architecture standards, processes, and principles. Define and develop the full scope of solutions, working across different teams and organizations to create effective outcomes. Work effectively in diverse environments, leveraging best practices and industry knowledge to enhance products and services. Serve as an advisor and mentor to team members, guiding projects and tasks. Guide and drive projects with manageable risks and resource requirements or oversee small teams, managing day-to-day operations, resource allocation, and workload distribution. Act as a key troubleshooter and subject matter expert on the Autonomous product portfolio, including fulfillment, assurance, inventory, security, and analytics. Key Skills And Experience You have: Bachelor’s degree in engineering/technology or equivalent with 10+ years of hands-on experience in autonomous networks driving large programs, and should have worked as an Architect/Designer for at least 5 years. 
Experience in at least one or two domains such as Orchestration/Fulfillment (FlowOne, CDPA, CDFF, NoRC); Assurance (NAC); Inventory (UIV, Discovery and Reconciliation); SSO/Security product suites (NIAM); or Analytics. Hands-on experience in Java, Expect scripting, Python, Kubernetes, microservices, databases, XML, XSLT, data parsing, SNMP, REST, SOAP, CORBA, LDAP, JMS, and FTP. Exposure to Oracle, Postgres, MongoDB, MariaDB, Neo4j, containerization, orchestration tools, and agile methodologies. It would be nice if you also had: Understanding of 5G slicing, 5G SA/NSA networks, IP/MPLS, optics, IMS, VoLTE, NFV/SDN, and fixed networks. Independent, disruptive thinking with a results-oriented mindset and strong communication skills. Ability to work in a fast-paced global environment with cross-cultural teams and customers. About Us Come create the technology that helps the world act together. Nokia is committed to innovation and technology leadership across mobile, fixed and cloud networks. Your career here will have a positive impact on people’s lives and will help us build the capabilities needed for a more productive, sustainable, and inclusive world. We challenge ourselves to create an inclusive way of working where we are open to new ideas, empowered to take risks and fearless to bring our authentic selves to work. What We Offer Nokia offers continuous learning opportunities, well-being programs to support you mentally and physically, opportunities to join and get supported by employee resource groups, mentoring programs, and highly diverse teams with an inclusive culture where people thrive and are empowered. Nokia is committed to inclusion and is an equal opportunity employer. Nokia has received the following recognitions for its commitment to inclusion and equality: One of the World’s Most Ethical Companies by Ethisphere; Gender-Equality Index by Bloomberg; Workplace Pride Global Benchmark. At Nokia, we act inclusively and respect the uniqueness of people.
Nokia’s employment decisions are made regardless of race, color, national or ethnic origin, religion, gender, sexual orientation, gender identity or expression, age, marital status, disability, protected veteran status or other characteristics protected by law. We are committed to a culture of inclusion built upon our core value of respect. Join us and be part of a company where you will feel included and empowered to succeed. About The Team As Nokia's growth engine, we create value for communication service providers and enterprise customers by leading the transition to cloud-native software and as-a-service delivery models. Our inclusive team of dreamers, doers, and disruptors push the limits from impossible to possible.
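Since the role calls for hands-on XML and data-parsing work across fulfillment, assurance, and inventory products, here is a minimal sketch of flattening an inventory-style XML export into per-port records with Python's standard library. The schema and element names below are hypothetical illustrations, not an actual Nokia product format.

```python
import xml.etree.ElementTree as ET

# Hypothetical inventory export; element and attribute names are illustrative only.
INVENTORY_XML = """
<inventory>
  <networkElement id="NE-001" type="router">
    <port name="ge-0/0/1" status="up"/>
    <port name="ge-0/0/2" status="down"/>
  </networkElement>
  <networkElement id="NE-002" type="switch">
    <port name="eth1" status="up"/>
  </networkElement>
</inventory>
"""

def parse_inventory(xml_text: str) -> list[dict]:
    """Flatten the XML tree into one record per port, ready for reconciliation."""
    root = ET.fromstring(xml_text)
    records = []
    for ne in root.iter("networkElement"):
        for port in ne.iter("port"):
            records.append({
                "element": ne.get("id"),
                "type": ne.get("type"),
                "port": port.get("name"),
                "up": port.get("status") == "up",
            })
    return records

records = parse_inventory(INVENTORY_XML)
print(len(records), sum(r["up"] for r in records))  # → 3 2
```

Flattening hierarchical exports into uniform records like this is a common first step before discovery-and-reconciliation logic compares the export against live network state.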
Posted 1 month ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Same Domain Architect (Autonomous Network) role as the listing above; the job description, responsibilities, required skills, and company information are identical.
Posted 1 month ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Same Domain Architect (Autonomous Network) role as the listing above; the job description, responsibilities, required skills, and company information are identical.
Posted 1 month ago
4.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Role Summary Pfizer’s purpose is to deliver breakthroughs that change patients’ lives. Research and Development is at the heart of fulfilling Pfizer’s purpose as we work to translate advanced science and technologies into the therapies and vaccines that matter most. Whether you are in the discovery sciences, ensuring drug safety and efficacy, or supporting clinical trials, you will apply cutting-edge design and process development capabilities to accelerate and bring best-in-class medicines to patients around the world. Pfizer is seeking a highly skilled and motivated AI Engineer to join our advanced technology team. The successful candidate will be responsible for developing, implementing, and optimizing artificial intelligence models and algorithms to drive innovation and efficiency in our Data Analytics and Supply Chain solutions. This role demands a collaborative mindset, a passion for cutting-edge technology, and a commitment to improving patient outcomes. Role Responsibilities Lead data modeling and engineering efforts within advanced data platforms teams to achieve digital outcomes. Provide guidance and lead or co-lead moderately complex projects. Oversee the development and execution of test plans, creation of test scripts, and thorough data validation processes. Lead the architecture, design, and implementation of Cloud Data Lake, Data Warehouse, Data Marts, and Data APIs. Lead the development of complex data products that benefit PGS and ensure reusability across the enterprise. Collaborate effectively with contractors to deliver technical enhancements. Oversee the development of automated systems for building, testing, monitoring, and deploying ETL data pipelines within a continuous integration environment. Collaborate with backend engineering teams to analyze data, enhancing its quality and consistency. Conduct root cause analysis and address production data issues.
Lead the design, development, and implementation of AI models and algorithms that solve sophisticated data analytics and supply chain problems. Stay abreast of the latest advancements in AI and machine learning technologies and apply them to Pfizer's projects. Provide technical expertise and guidance to team members and stakeholders on AI-related initiatives. Document and present findings, methodologies, and project outcomes to various stakeholders. Integrate and collaborate with different technical teams across Digital to drive overall implementation and delivery. Ability to work with large and complex datasets, including data cleaning, preprocessing, and feature selection. Basic Qualifications A bachelor's or master’s degree in Computer Science, Artificial Intelligence, Machine Learning, or a related discipline. Over 4 years of experience as a Data Engineer, Data Architect, or in Data Warehousing, Data Modeling, and Data Transformations. Over 2 years of experience in the development and deployment of AI, machine learning, and large language models (LLMs). A proven track record of successfully implementing AI solutions in a healthcare or pharmaceutical setting is preferred. Strong understanding of data structures, algorithms, and software design principles. Programming Languages: Proficiency in Python and SQL, and familiarity with Java or Scala. AI and Automation: Knowledge of AI-driven tools for data pipeline automation, such as Apache Airflow or Prefect; ability to use GenAI or agents to augment data engineering practices. Preferred Qualifications Data Warehousing: Experience with data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake. ETL Tools: Knowledge of ETL tools like Apache NiFi, Talend, or Informatica. Big Data Technologies: Familiarity with Hadoop, Spark, and Kafka for big data processing. Cloud Platforms: Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
Containerization: Understanding of Docker and Kubernetes for containerization and orchestration. Data Integration: Skills in integrating data from various sources, including APIs, databases, and external files. Data Modeling: Understanding of data modeling and database design principles, including graph technologies like Neo4j or Amazon Neptune. Structured Data: Proficiency in handling structured data from relational databases, data warehouses, and spreadsheets. Unstructured Data: Experience with unstructured data sources such as text, images, and log files, and tools like Apache Solr or Elasticsearch. Data Excellence: Familiarity with data excellence concepts, including data governance, data quality management, and data stewardship. Non-standard Work Schedule, Travel or Environment Requirements: Occasional travel required. Work Location Assignment: Hybrid. The annual base salary for this position ranges from $96,300.00 to $160,500.00. In addition, this position is eligible for participation in Pfizer’s Global Performance Plan with a bonus target of 12.5% of the base salary and eligibility to participate in our share-based long-term incentive program. We offer comprehensive and generous benefits and programs to help our colleagues lead healthy lives and to support each of life’s moments. Benefits offered include a 401(k) plan with Pfizer Matching Contributions and an additional Pfizer Retirement Savings Contribution, paid vacation, holiday and personal days, paid caregiver/parental and medical leave, and health benefits to include medical, prescription drug, dental and vision coverage. Learn more at Pfizer Candidate Site – U.S. Benefits | (uscandidates.mypfizerbenefits.com). Pfizer compensation structures and benefit packages are aligned based on the location of hire. The United States salary range provided does not apply to Tampa, FL or any location outside of the United States. Relocation assistance may be available based on business needs and/or eligibility.
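The data-validation responsibilities above can be sketched as a small rule-based quality gate of the kind an ETL task might run before loading. The record shape and rule set below are illustrative assumptions, not Pfizer's schema.

```python
from datetime import date

# Illustrative field-level rules; a real pipeline would load these from config.
RULES = {
    "batch_id": lambda v: isinstance(v, str) and v.strip() != "",
    "quantity": lambda v: isinstance(v, (int, float)) and v >= 0,
    "ship_date": lambda v: isinstance(v, date),
}

def validate(records):
    """Split records into clean rows and (row index, failed fields) for triage."""
    clean, failures = [], []
    for i, rec in enumerate(records):
        bad = [field for field, ok in RULES.items() if not ok(rec.get(field))]
        if bad:
            failures.append((i, bad))
        else:
            clean.append(rec)
    return clean, failures

rows = [
    {"batch_id": "B-100", "quantity": 25, "ship_date": date(2024, 5, 1)},
    {"batch_id": "", "quantity": -3, "ship_date": "2024-05-01"},  # fails all three rules
]
clean, failures = validate(rows)
print(len(clean), failures)  # → 1 [(1, ['batch_id', 'quantity', 'ship_date'])]
```

Recording which fields failed per row, rather than rejecting whole batches silently, is what makes the downstream root-cause analysis mentioned in the responsibilities tractable.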
Sunshine Act Pfizer reports payments and other transfers of value to health care providers as required by federal and state transparency laws and implementing regulations. These laws and regulations require Pfizer to provide government agencies with information such as a health care provider’s name, address and the type of payments or other value received, generally for public disclosure. Subject to further legal review and statutory or regulatory clarification, which Pfizer intends to pursue, reimbursement of recruiting expenses for licensed physicians may constitute a reportable transfer of value under the federal transparency law commonly known as the Sunshine Act. Therefore, if you are a licensed physician who incurs recruiting expenses as a result of interviewing with Pfizer that we pay or reimburse, your name, address and the amount of payments made currently will be reported to the government. If you have questions regarding this matter, please do not hesitate to contact your Talent Acquisition representative. EEO & Employment Eligibility Pfizer is committed to equal opportunity in the terms and conditions of employment for all employees and job applicants without regard to race, color, religion, sex, sexual orientation, age, gender identity or gender expression, national origin, disability or veteran status. Pfizer also complies with all applicable national, state and local laws governing nondiscrimination in employment as well as work authorization and employment eligibility verification requirements of the Immigration and Nationality Act and IRCA. Pfizer is an E-Verify employer. This position requires permanent work authorization in the United States. Information & Business Tech
Posted 1 month ago
6.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Key Attributes: Adaptability & Agility Thrive in a fast-paced, ever-evolving environment with shifting priorities. Demonstrated ability to quickly learn and integrate new technologies and frameworks. Strong problem-solving mindset with the ability to juggle multiple priorities effectively. Core Responsibilities Design, develop, test, and maintain robust Python applications and data pipelines using Python/PySpark. Define and implement smart data pipelines from RDBMS to graph databases. Build and expose APIs using AWS Lambda and ECS-based microservices. Collaborate with cross-functional teams to define, design, and deliver new features. Write clean, efficient, and scalable code following best practices. Troubleshoot, debug, and optimize applications for performance and reliability. Contribute to the setup and maintenance of CI/CD pipelines and deployment workflows as required. Ensure security, compliance, and observability across all development activities. All you need is... Required Skills & Experience Expert-level proficiency in Python with a strong grasp of object-oriented and functional programming. Solid experience with SQL and graph databases (e.g., Neo4j, Amazon Neptune). Hands-on experience with cloud platforms – AWS and/or Azure is a must. Proficiency in PySpark or similar data ingestion and processing frameworks. Familiarity with DevOps tools such as Docker, Kubernetes, Jenkins, and Git. Strong understanding of CI/CD, version control, and agile development practices. Excellent communication and collaboration skills. Desirable Skills Experience with agentic AI, machine learning, or LLM-based systems. Familiarity with Apache Iceberg or similar modern data lakehouse formats. Knowledge of Infrastructure as Code (IaC) tools like Terraform or Ansible. Understanding of microservices architecture and distributed systems. Exposure to observability tools (e.g., Prometheus, Grafana, ELK stack). Experience working in Agile/Scrum environments.
Minimum Qualifications 6 to 8 years of hands-on experience in Python development and data engineering. Demonstrated success in delivering production-grade software and scalable data solutions.
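The RDBMS-to-graph pipelines mentioned in the responsibilities can be sketched as an extraction step that maps relational rows to nodes and foreign keys to relationships, producing the shape a graph loader (e.g., parameterized Neo4j CREATE statements) could consume. Here sqlite3 stands in for the source RDBMS, and the table and label names are illustrative assumptions.

```python
import sqlite3

# A small in-memory relational source; schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 99.5), (11, 1, 12.0), (12, 2, 7.25);
""")

def extract_graph(conn):
    """Map each row to a node dict and each foreign key to a relationship dict."""
    nodes, rels = [], []
    for cid, name in conn.execute("SELECT id, name FROM customers"):
        nodes.append({"label": "Customer", "key": cid, "props": {"name": name}})
    for oid, cid, total in conn.execute("SELECT id, customer_id, total FROM orders"):
        nodes.append({"label": "Order", "key": oid, "props": {"total": total}})
        rels.append({"type": "PLACED", "from": cid, "to": oid})  # customer -> order
    return nodes, rels

nodes, rels = extract_graph(conn)
print(len(nodes), len(rels))  # → 5 3
```

Keeping the extraction pure (rows in, plain dicts out) makes the step easy to unit-test and reuse whether the sink is Neo4j, Neptune, or a batch file for a bulk importer.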
Posted 1 month ago