5.0 - 9.0 years
0 Lacs
punjab
On-site
As an experienced Node.js Developer, you will play a crucial role in providing technical leadership and guiding team members toward delivering high-quality work within project timelines. While this is not a Team Lead position, we highly value initiative in leading and mentoring junior developers.

Your primary responsibilities will include mentoring junior developers, offering technical support, and conducting code reviews to maintain code quality and best practices. You will manage development workflows and project timelines to ensure timely delivery of high-quality work. Your expertise in Node.js will be essential for building scalable applications and upholding coding best practices across the team. Additionally, you should have a strong understanding of both SQL and NoSQL databases, such as MongoDB, to optimize database performance, integrate data storage solutions, and design efficient database structures. Experience with microservices architecture, including tools like Redis, RabbitMQ, and Kafka, is crucial for leading initiatives to build scalable, resilient systems.

You will design and implement RESTful APIs that integrate seamlessly with front-end applications. Ensuring application security, scalability, and performance will be a key focus, including low-latency and high-availability designs. Collaborating with front-end developers and cross-functional teams will be essential to ensure smooth communication across all stakeholders. Promoting coding best practices, writing reusable and efficient code, and championing clean code principles will be part of your responsibilities, as will facilitating team dynamics, encouraging productive collaboration, and maintaining a supportive, growth-oriented work environment.

To excel in this role, you must have extensive hands-on experience with Node.js and frameworks like ExpressJS, a strong understanding of MongoDB, and experience with both NoSQL and SQL databases. Familiarity with microservices technologies such as Redis, RabbitMQ, and Kafka is essential, along with knowledge of cloud platforms, especially AWS S3, and best practices for cloud deployment. Your ability to design, implement, and maintain low-latency, high-availability, and secure applications will be critical to the team's success.

Overall, as a Node.js Developer, you will be a key player in driving technical excellence, fostering collaboration, and ensuring the delivery of high-quality, scalable applications.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Data Architect, your primary responsibility is to design and manage scalable, secure, and high-performance data architectures that cater to the needs of GEDU and its customers. You will play a crucial role in ensuring that data assets within GEDU are well structured, managed to support insightful decision-making, protected in their integrity, and aligned with strategic goals.

Your key responsibilities will include designing, developing, and maintaining the enterprise data architecture: data models, database schemas, and data flow diagrams. You will develop a data strategy and roadmap that aligns with GEDU's business objectives while ensuring scalability, and architect both transactional (OLTP) and analytical (OLAP) databases to guarantee optimal performance and data consistency.

You will oversee the integration of disparate data sources into a unified data platform, leveraging ETL/ELT processes and data integration tools, and design and implement data warehousing solutions, data lakes, and data marts that enable efficient storage and retrieval of large datasets. Ensuring proper data governance, including data ownership, security, and privacy controls in compliance with standards like GDPR and HIPAA, will be essential.

Collaborating closely with business stakeholders, analysts, developers, and executives to understand data requirements and ensure that the architecture supports analytics and reporting needs will be crucial. You will also work with DevOps and engineering teams to optimize database performance and support large-scale data processing pipelines. Guiding the selection of data technologies such as databases (SQL/NoSQL), data processing frameworks (Hadoop, Spark), cloud platforms (Azure), and analytics tools will be part of your role, as will staying current with emerging data management technologies and assessing their potential application within the organization.

You will define data quality standards and implement processes to ensure accuracy, completeness, and consistency of data across all systems, and establish protocols for data security, encryption, and backup/recovery to protect data assets and ensure business continuity. Additionally, you will lead and mentor data engineers, data modelers, and other technical staff in best practices for data architecture and management, and provide strategic guidance on data-related projects to keep them aligned with the enterprise data strategy.

With over 7 years of experience in data architecture, data modeling, and database management, you should be proficient in designing and implementing relational (SQL) and non-relational (NoSQL) database solutions. Strong experience with data integration tools (Azure tools are a must), ETL/ELT processes, and data pipelines is required. Expertise in the Azure cloud data platform is a must; additional experience with AWS (Redshift, S3), Azure (Data Lake, Synapse), or Google Cloud Platform (BigQuery, Dataproc) is a bonus. Hands-on experience with big data technologies (Hadoop, Spark), distributed systems for large-scale data processing, data warehousing solutions, and BI tools (Power BI, Tableau, Looker) will also be beneficial.

Your technical leadership skills should enable you to lead data-driven projects, manage stakeholders effectively, and drive data strategies across the enterprise. Strong programming skills in languages such as Python, SQL, R, or Scala will also be necessary. Finally, your pre-sales responsibilities will involve stakeholder engagement, solution development, building proofs of concept (POCs), client communication, and delivering technical presentations to prospective clients, ensuring that proposed solutions align with business objectives, meet security and performance benchmarks, and are communicated clearly to clients and stakeholders.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
This is an opportunity for a hands-on Backend Developer who is adept at solving real-world problems, working with intricate legacy systems, and scaling systems for millions of users. The role offers the chance to take ownership of core backend services, revamp existing logic, and develop new features from scratch. If you have a deep passion for sports and aspire to contribute to a product used daily by numerous players and venues, we are eager to have you on board.

Your responsibilities will include designing, developing, and maintaining scalable backend services using NodeJS. You will refactor and enhance our legacy codebase to improve performance, readability, and structure, and collaborate with product, frontend, and design teams to deliver impactful features. You will also debug, test, and monitor APIs in a real-world production environment, write clear documentation, engage in code reviews, and help modernize and optimize the backend deployment pipeline on AWS ECS.

Requirements: 3-5 years of hands-on experience building backend applications with Node.js; a strong grasp of API design, REST principles, and backend architecture; familiarity with MongoDB data modeling and Redis usage patterns; experience working with distributed systems and deploying on AWS; the ability to refactor code and continually improve it; and good communication skills with a collaborative mindset. Exposure to TDD/BDD, performance optimization, and developer tooling is a bonus.

Technologies you will work with include Node.js, MongoDB, Redis, and REST APIs, along with AWS services such as ECS, EC2, CloudWatch, and S3. Proficiency in Git, Linux, Postman, and basic CI/CD is expected, with ongoing improvements in this area.

If you believe you have the skills and drive to excel in this role, please reach out to us at careers@playo.co with your resume, highlighting your motivation for the position and the qualifications that make you the ideal candidate.
Posted 2 weeks ago
5.0 - 15.0 years
0 Lacs
tamil nadu
On-site
The Applications Development Technology Senior Lead Analyst is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology Team. Your main objective will be to lead applications systems analysis and programming activities. To excel in this role, you should possess natural analytical skills and be an innovative thinker capable of providing efficient and effective solutions to complex Technology and Business problems.

Your responsibilities will include leading the integration of functions to meet goals, deploying new products, and enhancing processes. You will analyze complex business processes, system processes, and industry standards to define and develop solutions to high-level problems. Additionally, you will provide expertise in advanced applications programming and plan assignments involving large budgets, cross-functional projects, or multiple projects. Developing application methodologies and standards for program analysis, design, coding, testing, debugging, and implementation will also be part of your role. You will utilize advanced knowledge of supported main system flows and comprehensive knowledge of multiple areas to achieve technology goals, and consult with end users to identify system function specifications and incorporate them into the overall system design.

As a Senior Lead Analyst, you will allocate work and act as an advisor/coach to developers, analysts, and new team members. You will also influence and negotiate with senior leaders and communicate effectively with external parties. When making business decisions, you must appropriately assess risk and demonstrate particular consideration for the firm's reputation while safeguarding Citigroup, its clients, and assets. This includes driving compliance with applicable laws, rules, and regulations, adhering to policy, applying sound ethical judgment, and escalating, managing, and reporting control issues with transparency.

Qualifications:
- 15+ years of application/software development experience
- 10+ years of experience with technologies such as Java (Core Java, J2EE, Spring Boot RESTful services), Python, web services (REST, SOAP), XML, JavaScript, microservices, SOA, etc.
- 5+ years of experience with Big Data technologies like Apache Spark, Hive, Hadoop, and Storm
- Strong understanding of data structures, algorithms, and design patterns
- Knowledge of technologies like ELK, Docker, Kubernetes, Azure Cloud, AWS S3, etc.
- Familiarity with NoSQL databases like MongoDB, HBase, Cassandra, etc.
- Experience with version control systems (e.g., Git) and CI/CD pipelines
- Working experience with financial applications/finance processes is a plus
- Extensive experience working in a multicultural environment and delivering results with virtual teams
- Demonstrated leadership, project management, and development skills
- Strong problem-solving skills with the ability to work independently, multitask, and take ownership of various analyses or reviews
- Traits of "Taking Ownership," "Delivering with Pride," and "Succeeding Together"
- Bachelor's degree/University degree or equivalent experience required; Master's degree preferred

Citi is an equal opportunity and affirmative action employer.
Posted 2 weeks ago
3.0 - 8.0 years
6 - 14 Lacs
gurugram
Work from Office
Recruiter LinkedIn: linkedin.com/in/yashsharma1608
Contract period: 6-12 months
Payroll: ASV Consulting (my company); the client will be disclosed after round 1
Job location: Gurgaon, on-site (WFO)
Budget: up to 1 lakh/month, depending on last relevant hike
Experience: 3+ years

About the Role
We are looking for a skilled AWS Data Engineer with strong hands-on experience in AWS Glue and AWS analytics services. The candidate will be responsible for designing, building, and optimizing scalable data pipelines and ETL processes that support advanced analytics and business intelligence requirements.

Key Responsibilities
- Design and develop ETL pipelines using AWS Glue, PySpark, and AWS services (Lambda, Step Functions, S3, etc.); a minimal Glue job sketch follows this listing.
- Build and maintain data lakes and warehouses using AWS S3, Athena, Redshift, and the Glue Data Catalog.
- Automate data ingestion and transformation from structured and unstructured sources.
- Monitor, troubleshoot, and optimize pipeline performance and cost efficiency.
- Collaborate with analysts, data scientists, and business stakeholders to define requirements.
- Ensure data governance, quality, and security standards.
- Document data workflows, schemas, and technical solutions.

Required Qualifications
- 3+ years of experience in data engineering, preferably with cloud platforms.
- Strong experience with AWS Glue (ETL jobs, crawlers, workflows).
- Proficiency in PySpark, Python, and SQL.
- Hands-on experience with AWS services: S3, Lambda, Step Functions, CloudWatch, IAM, Athena, Redshift, Glue Data Catalog.
- Knowledge of data warehousing, data modeling, and data lakes.
- Strong pipeline orchestration and performance-tuning skills.
- Familiarity with DevOps tools (CI/CD, Git) and Agile methodology.

Preferred Qualifications
- AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect.
- Experience with streaming data (Kinesis, Kafka).
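For illustration, here is a minimal sketch of the kind of Glue ETL job this posting describes: read a catalog table, filter it with PySpark, and write Parquet to S3. The database, table, and bucket names are hypothetical placeholders, not details from the posting.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job boilerplate: resolve the job name and initialise contexts.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog (names are placeholders).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Drop cancelled orders, then land the result as Parquet in the lake.
valid = orders.filter(f=lambda row: row["status"] != "CANCELLED")
glue_context.write_dynamic_frame.from_options(
    frame=valid,
    connection_type="s3",
    connection_options={"path": "s3://example-lake/curated/orders/"},
    format="parquet",
)
job.commit()
```

The same job could equally be built through a Glue visual workflow; the point is the read-transform-write shape, not the specific table.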
Posted 2 weeks ago
5.0 - 8.0 years
10 - 18 Lacs
hyderabad, bengaluru
Hybrid
Position: Java Developer with AWS
Location: Hyderabad (hybrid role)
Duration: Full-time role
Notice period: Immediate joiners
Mandatory skills: 6+ years with AWS Lambda, AWS EC2, AWS S3, RESTful APIs, Java, REST API
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
noida, uttar pradesh
On-site
We are searching for a highly skilled and seasoned Senior ETL & Data Streaming Engineer with over 10 years of experience to take on a crucial role in the design, development, and maintenance of our robust data pipelines. The ideal candidate will possess in-depth expertise in batch ETL processes as well as real-time data streaming technologies, along with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is a must.

Your responsibilities will include designing, developing, and implementing highly scalable, fault-tolerant, and performant ETL processes using leading ETL tools to extract, transform, and load data from diverse source systems into our Data Lake and Data Warehouse. You will also architect and construct batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to meet immediate data ingestion and processing requirements (a minimal streaming sketch follows this listing). Furthermore, you will leverage and optimize AWS data services such as AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, and AWS EMR to develop and manage data pipelines.

Collaboration with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions is a key aspect of the role. You will ensure data quality, integrity, and security across all data pipelines and storage solutions, and monitor, troubleshoot, and optimize existing pipelines for performance, cost efficiency, and reliability. You will also develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs, and implement data governance policies and best practices within the Data Lake and Data Warehouse environments. As a mentor to junior engineers, you will help foster a culture of technical excellence and continuous improvement, staying current with emerging technologies and industry best practices in data engineering, ETL, and streaming.

Required Qualifications:
- 10+ years of progressive experience in data engineering, focusing on ETL, ELT, and data pipeline development.
- Extensive hands-on experience with commercial or open-source ETL tools (Talend).
- Proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar.
- Proficiency with AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, and potentially AWS EMR.
- Strong background in traditional data warehousing concepts, dimensional modeling, and DWH design principles.
- Proficiency in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation.
- Strong understanding of relational and NoSQL databases.
- Experience with version control systems (e.g., Git).
- Excellent analytical and problem-solving skills with attention to detail.
- Strong verbal and written communication skills for conveying complex technical concepts to diverse audiences.

Preferred Qualifications:
- Certifications in AWS Data Analytics or related areas.
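As a hedged sketch of the streaming half of this role, the snippet below uses Spark Structured Streaming to consume a Kafka topic and land it in S3. The broker address, topic name, message schema, and paths are illustrative assumptions, and the job presumes the spark-sql-kafka connector is available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Assumed message layout; a real job would pull this from a schema registry.
schema = StructType().add("event_id", StringType()).add("amount", DoubleType())

# Consume the Kafka topic as an unbounded streaming DataFrame.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
)

# Kafka delivers bytes; cast the value to string and parse the JSON payload.
events = raw.select(
    from_json(col("value").cast("string"), schema).alias("e")
).select("e.*")

# Continuously append Parquet to the lake, tracking offsets in a checkpoint.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3://example-lake/bronze/orders/")
    .option("checkpointLocation", "s3://example-lake/checkpoints/orders/")
    .start()
)
query.awaitTermination()
```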
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
punjab
On-site
As a Node.js Team Lead at Apptunix, you will oversee the development and implementation of cutting-edge solutions using Node.js technology. Your role will involve working closely with a team of in-house experts to create innovative software and mobile applications that meet the needs of our clients.

To excel in this position, you should have deep experience working with Node.js and a solid understanding of SQL and NoSQL database systems, particularly MongoDB. Your responsibilities will include building RESTful APIs, integrating user-facing elements, and ensuring the scalability and security of applications. You will also be expected to write reusable and efficient code, design low-latency and high-availability applications, and implement security and data protection measures. Experience with ExpressJs, AWS S3, and ES6 will be beneficial for this role.

If you are passionate about leveraging advanced technologies to drive business growth and delivering exceptional solutions to clients, we encourage you to apply for the Node.js Team Lead position at Apptunix. Join us in our mission to empower startups and enterprise businesses through technology solutions. Apply now and be part of our dynamic team!
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
As an Application Support Engineer at JIFFY.ai, you will be responsible for ensuring operational excellence across customer-facing applications and backend services built on a modern microservices architecture. Your role will involve identifying, diagnosing, and resolving production issues, maintaining platform availability, supporting customers in multi-tenant environments, and contributing to Business Continuity Planning (BCP) and Disaster Recovery (DR) readiness.

You will act as the first responder for production issues raised by customers or monitoring systems, troubleshoot full-stack issues involving UI, APIs, backend microservices, and AWS cloud services, and participate in incident response, managing escalations and contributing to root cause analysis and documentation. Collaboration with engineering and DevOps teams to deploy fixes and enhancements will also be a key responsibility.

Your duties will include maintaining and improving runbooks, knowledge base articles, and system health dashboards, as well as monitoring application performance, uptime, and health using tools like Grafana, Prometheus, and OpenSearch. You will proactively detect patterns of recurring issues, recommend long-term solutions or automation, and assist in post-deployment validations. Additionally, you will enforce budget controls per tenant/app based on customer SLAs and usage, and ensure compliance with change control processes during critical deployments and rollback scenarios.

To qualify for this role, you should have 2-3 years of experience in application support, cloud operations, or site reliability roles. Experience debugging issues in microservices written in Go, Java, Node.js, or Python is essential, along with a strong understanding of web protocols, familiarity with AWS services, competence in Linux system administration, and hands-on experience with monitoring/logging tools. Excellent problem-solving and incident resolution skills, as well as strong written and verbal communication, are also required. Nice-to-have skills include exposure to Kubernetes-based deployments, experience with CI/CD pipelines, scripting skills for automation tasks, and familiarity with incident management frameworks and support SLAs.

Joining the team at JIFFY.ai offers you the opportunity to work with modern technologies, solve real-world problems in enterprise automation, and benefit from a learning-focused culture with technical mentorship and career development opportunities. You will work on-site at the Technopark, Trivandrum office in a fast-paced environment.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
punjab
On-site
Apptunix, a Mobile App & Web Solutions development agency based in Texas, US, focuses on empowering cutting-edge startups and enterprise businesses to achieve incremental growth through innovative technology solutions. Since its establishment in mid-2013, Apptunix has been dedicated to enhancing client interests and satisfaction by delivering advanced software and mobile development solutions. With a team of over 250 in-house experts, Apptunix collaborates closely with clients to develop tailored solutions that meet their specific needs.

The ideal candidate for this position has over 3.5 years of experience, deep expertise in Node.js, a thorough understanding of SQL and NoSQL database systems (particularly MongoDB), and familiarity with MVC and stateless APIs. Applicants should have experience in building RESTful APIs, scaling applications, and addressing security considerations. Proficiency in ExpressJs, AWS S3, and ES6 is required, along with the ability to write reusable, testable, and efficient code. The role also involves designing and implementing low-latency, high-availability applications, ensuring security and data protection, and integrating data storage solutions.

This is a full-time position that requires in-person attendance from Monday to Friday.
Posted 2 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
bengaluru
Work from Office
Design, build, and maintain data pipelines on the AWS platform. Work with AWS services like S3, Glue, EMR, and Redshift. Process and analyze large datasets to support business insights. Ensure data quality, integrity, and security in the data lake. Location - Pan India.
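To give one concrete flavour of such a pipeline, here is a small boto3 sketch that runs an Athena query over lake data and waits for the result; the database, query, and result bucket are assumptions, not details from the posting.

```python
import time

import boto3

athena = boto3.client("athena", region_name="ap-south-1")

# Kick off a query against a lake table registered in the Glue Data Catalog.
run = athena.start_query_execution(
    QueryString="SELECT order_date, SUM(amount) AS revenue "
                "FROM orders GROUP BY order_date",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)

# Poll until the query finishes; result files land under the OutputLocation.
while True:
    status = athena.get_query_execution(
        QueryExecutionId=run["QueryExecutionId"]
    )["QueryExecution"]["Status"]["State"]
    if status in ("SUCCEEDED", "FAILED", "CANCELLED"):
        print("query finished:", status)
        break
    time.sleep(2)
```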
Posted 2 weeks ago
9.0 - 13.0 years
0 Lacs
haryana
On-site
About Markovate: At Markovate, you don't just follow trends, we drive them. We transform businesses through innovative AI and digital solutions that turn vision into reality. Our team harnesses breakthrough technologies to craft bespoke strategies that align seamlessly with our clients' ambitions. From AI consulting and Gen AI development to pioneering AI agents and agentic AI, we empower our partners to lead their industries with forward-thinking precision and an unmatched overview.

We are seeking a highly experienced and innovative Senior Data Engineer with a strong background in hybrid cloud data integration, pipeline orchestration, and AI-driven data modelling.

Requirements:
- 9+ years of experience in data engineering and data architecture.
- Excellent communication and interpersonal skills, with the ability to engage with teams.
- Strong problem-solving, decision-making, and conflict-resolution abilities.
- Proven ability to work independently and lead cross-functional teams.
- Ability to work in a fast-paced, dynamic environment and handle sensitive issues with discretion and professionalism.
- Ability to maintain confidentiality and handle sensitive information with attention to detail and discretion.
- Strong work ethic and trustworthiness.
- Highly collaborative and team-oriented, with a commitment to the responsibilities below.

Responsibilities:
- Design and develop hybrid ETL/ELT pipelines using AWS Glue and Azure Data Factory (ADF).
- Process files from AWS S3 and Azure Data Lake Gen2, including schema validation and data profiling.
- Implement event-based orchestration using AWS Step Functions and Apache Airflow (Astronomer); a minimal orchestration sketch follows this listing.
- Develop and maintain bronze, silver, and gold data layers using DBT or Coalesce.
- Create scalable ingestion workflows using Airbyte, AWS Transfer Family, and Rivery.
- Integrate with metadata and lineage tools like Unity Catalog and OpenMetadata.
- Build reusable components for schema enforcement, EDA, and alerting (e.g., MS Teams).
- Work closely with QA teams to integrate test automation and ensure data quality.
- Collaborate with cross-functional teams, including data scientists and business stakeholders, to align solutions with AI/ML use cases.
- Document architectures, pipelines, and workflows for internal stakeholders.

Experience with:
- Cloud platforms such as AWS (Glue, Step Functions, Lambda, S3, CloudWatch, SNS, Transfer Family) and Azure (ADF, ADLS Gen2, Azure Functions, Event Grid).
- Transformation and ELT tools like Databricks (PySpark), DBT, Coalesce, and Python.
- Data ingestion methods including Airbyte, Rivery, SFTP/Excel files, and SQL Server extracts.
- Data modeling techniques including CEDM, Data Vault 2.0, and Dimensional Modelling.
- Orchestration tools such as AWS Step Functions, Airflow (Astronomer), and ADF Triggers.
- Monitoring and logging tools like CloudWatch, AWS Glue Metrics, MS Teams alerts, and Azure Data Explorer (ADX).
- Data governance and lineage tools: Unity Catalog, OpenMetadata, and schema drift detection.
- Version control and CI/CD using GitHub, Azure DevOps, CloudFormation, Terraform, and ARM templates.
- Cloud data platforms, ETL tools, AI/Generative AI concepts and frameworks, data warehousing solutions, big data technologies, SQL, and at least one programming language.

Great to have:
- Experience with cloud data platforms (e.g., AWS, Azure, GCP) and their data and AI services.
- Knowledge of ETL tools and frameworks (e.g., Apache NiFi, Talend, Informatica).
- Deep understanding of AI/Generative AI concepts and frameworks (e.g., TensorFlow, PyTorch, Hugging Face, OpenAI APIs).
- Experience with data modeling, data structures, and database design.
- Proficiency with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake).
- Hands-on experience with big data technologies (e.g., Hadoop, Spark, Kafka).
- Proficiency in SQL and at least one programming language.

What it's like to be at Markovate:
- At Markovate, we thrive on collaboration and embrace every innovative idea.
- We invest in continuous learning to keep our team ahead in the AI/ML landscape.
- Transparent communication is key; every voice at Markovate is valued.
- Our agile, data-driven approach transforms challenges into opportunities.
- We offer flexible work arrangements that empower creativity and balance.
- Recognition is part of our DNA; your achievements drive our success.
- Markovate is committed to sustainable practices and positive community impact.
- Our people-first culture means your growth and well-being are central to our mission.
- Location: hybrid model, 2 days onsite.
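The orchestration bullet above is easiest to picture with a minimal Airflow DAG that triggers an AWS Glue job run. The DAG id, job name, and schedule are hypothetical, and the `schedule` argument assumes Airflow 2.4 or later.

```python
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def run_bronze_to_silver(**_):
    # Start a Glue job run; the job name is a placeholder.
    boto3.client("glue").start_job_run(JobName="bronze_to_silver")


with DAG(
    dag_id="hybrid_elt",               # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # 'schedule_interval' on older Airflow
    catchup=False,
) as dag:
    promote = PythonOperator(
        task_id="bronze_to_silver",
        python_callable=run_bronze_to_silver,
    )
```

A production DAG would add sensors, retries, and alerting (for example, the MS Teams alerts the posting mentions); this shows only the trigger pattern.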
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
navi mumbai, maharashtra
On-site
As a skilled professional with over 5 years of programming experience in Python, you will design, develop, and implement performant ETL pipelines using the Python API (pySpark) of Apache Spark on AWS EMR. The role calls for hands-on experience in developing ETL data pipelines, configuring EMR clusters on AWS, and working with AWS S3 object storage from Spark. You should have strong proficiency in Python, familiarity with functional programming concepts, and data modeling skills. Your key responsibilities will include understanding Spark's DataFrame API, troubleshooting Spark jobs, and monitoring them using the Spark UI.

The ideal candidate has at least 3 years of hands-on experience developing ETL data pipelines using pySpark on AWS EMR and a good understanding of Spark's DataFrame API. This is a full-time, permanent position suitable for someone with a solid background in ETL, Python programming, Apache Spark, pySpark, AWS S3, and data modeling.

The work location is in person, and the application deadline is 22/02/2025. Benefits include the flexibility to work from home, and the schedule consists of day, fixed, and morning shifts. If you have a strong background in ETL, Python, Apache Spark, pySpark, AWS S3, and data modeling, we encourage you to apply for this exciting opportunity.
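A minimal pySpark job of the kind this posting describes, reading JSON from S3 on EMR, aggregating, and writing Parquet back, might look like the following; the bucket names, columns, and filter are illustrative assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("emr-etl").getOrCreate()

# EMR reads s3:// paths natively via EMRFS (paths here are placeholders).
orders = spark.read.json("s3://example-raw/orders/")

# Keep completed orders and roll revenue up by day.
daily = (
    orders.filter(F.col("status") == "COMPLETE")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

daily.write.mode("overwrite").parquet("s3://example-curated/daily_revenue/")
spark.stop()
```

Troubleshooting such a job typically means checking stage-level skew and shuffle sizes in the Spark UI, which is why the posting calls that skill out.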
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a seasoned Software Development Leader, you will lead and mentor a high-performing team of developers proficient in frontend and backend technologies, focusing on their growth and the successful delivery of projects. Collaborating with cross-functional teams, you will actively participate in designing and architecting robust, scalable, secure, and performant end-to-end project solutions.

Your expertise in designing and implementing highly efficient, reliable, and scalable database systems using PostgreSQL and MongoDB will be crucial. Leveraging a wide range of AWS services, including EC2, S3, Lambda, RDS, DynamoDB, and API Gateway, you will build cloud-based application solutions and manage infrastructure effectively. Utilizing serverless architecture, particularly AWS Lambda, you will develop scalable and cost-efficient applications.

As the primary technical point of contact for client communications, you will translate complex business requirements into clear technical solutions, ensuring high customer satisfaction. Collaborating closely with developers, architects, and stakeholders, you will execute the technical vision and strategy seamlessly across all development phases. You will stay abreast of emerging technologies and industry best practices, applying them to continuously improve team efficiency, product quality, and development processes.

**Required Skills & Qualifications:**
- **Overall Experience:** Minimum of 8 years of experience in software development.
- **Leadership Experience:** At least 3 years in a leadership role, managing and guiding a team of developers.
- **Backend Development:** Strong expertise in Python, with hands-on experience in frameworks like Django, Flask, FastAPI, or similar.
- **Frontend Development:** Proficient in JavaScript, HTML/CSS, and modern frontend frameworks (e.g., React, Angular, or Vue.js).
- **Cloud Infrastructure:** Proven proficiency with a wide array of AWS services, including Lambda, S3, EC2, RDS, DynamoDB, and API Gateway, for building cloud-based solutions.
- **Database Management:** Strong experience with PostgreSQL and MongoDB, including the ability to design scalable and efficient database architectures.
- **Serverless Architecture:** Demonstrated experience designing and implementing serverless applications, especially using AWS Lambda and other serverless components.
- **Communication:** Fluent written and verbal communication skills for effective interaction with both technical and non-technical stakeholders, including clients.

**Nice To Have:**
- Experience with additional cloud platforms such as Azure or Google Cloud.
- Solid knowledge of DevOps practices and CI/CD pipelines.
- Experience with Node.js for backend or full-stack development.
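Since the role combines Python backends with AWS Lambda and S3, here is a minimal sketch of a Lambda handler reacting to S3 put events; the trigger wiring and the assumption that objects hold JSON arrays are illustrative only.

```python
import json

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    """Process objects referenced by S3 put events (trigger wiring assumed)."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        items = json.loads(body)  # assumes the object holds a JSON array
        print(f"processed s3://{bucket}/{key}: {len(items)} items")
    return {"statusCode": 200}
```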
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
mysore, karnataka
On-site
About iSOCRATES: Since 2015, iSOCRATES has been advising, building, and managing mission-critical Marketing, Advertising, and Data technologies, platforms, and processes as the Global Leader in MADTECH Resource Planning and Execution(TM). The company delivers globally proven, reliable, and affordable Strategy and Operations Consulting and Managed Services for marketers, agencies, publishers, and the data/tech providers that enable them. iSOCRATES is staffed 24/7/365 with proven specialists who save partners money and time while achieving transparent, accountable performance and delivering extraordinary value. The savings stem from a low-cost, focused global delivery model at scale that benefits from continuous reinvestment in technology and specialized training.

About MADTECH.AI: MADTECH.AI is the Unified Marketing, Advertising, and Data Decision Intelligence Platform purpose-built to deliver speed to value for marketers. At MADTECH.AI, real-time AI-driven insights are made accessible to everyone. Whether you're a global or emerging brand, agency, publisher, or data/tech provider, MADTECH.AI provides a single source of truth, enabling sharper insights that drive better marketing decisions faster and more affordably than ever before. The platform unifies and transforms MADTECH data and centralizes decision intelligence in a single, affordable platform, handling tasks like data wrangling, data model building, proactive problem-solving, and data visualization.

Job Description: iSOCRATES is seeking a highly skilled and experienced Lead Data Scientist to spearhead its growing Data Science team. The Lead Data Scientist will lead the team that defines, designs, reports on, and analyzes audience, campaign, and programmatic media trading data. This role involves collaborating with cross-functional teams and working across various media channels, including digital and offline channels such as display, mobile, video, social, native, and advanced TV/audio ad products.

Key Responsibilities:
- Team leadership and management: Lead and mentor a team of data scientists driving the design, development, and implementation of data-driven solutions for media and marketing campaigns.
- Advanced analytics and data science expertise: Provide hands-on leadership in applying statistical, econometric, and Big Data methods to define requirements, design analytics solutions, analyze results, and optimize economic outcomes; bring expertise in modeling techniques such as propensity modeling, Media Mix Modeling (MMM), Multi-Touch Attribution (MTA), Recency/Frequency/Monetary (RFM) analysis, Bayesian statistics, and non-parametric methods (a toy propensity-model sketch follows this listing).
- Generative AI and NLP: Lead the implementation and development of Generative AI, Large Language Models, and Natural Language Processing (NLP) techniques to enhance data modeling, prediction, and analysis.
- Data architecture and management: Architect and manage dynamic data systems from diverse sources, ensuring effective integration and optimization of audience, pricing, and contextual data for programmatic and digital advertising campaigns; oversee the DSPs, SSPs, DMPs, and other data systems integral to the ad-tech ecosystem.
- Cross-functional collaboration: Work closely with Product, System Development, Yield, Operations, Finance, Sales, Business Development, and other teams to ensure data quality, completeness, and predictive outcomes across campaigns; design and deliver actionable insights, innovative data-driven solutions, and reporting tools for both iSOCRATES teams and business partners.
- Predictive modeling and optimization: Lead the development of predictive models and analyses to drive programmatic optimization, focusing on revenue, audience behavior, bid actions, and ad inventory optimization (eCPM, fill rate, etc.); monitor and analyze campaign performance and make data-driven recommendations for optimization across websites, mobile apps, and social media platforms.
- Data collection and quality assurance: Oversee the design, collection, and management of data, ensuring high quality standards, efficient storage systems, and readiness for in-depth analysis and visualization; guide the implementation of tools for complex data analysis, model development, reporting, and visualization, aligned with business objectives.

Qualifications:
- Master's or Ph.D. in Statistics, Engineering, Science, or Business with a strong foundation in mathematics and statistics.
- 8 to 10 years of experience, with at least 5 years of hands-on work in data science, predictive analytics, media research, and digital analytics, focused on modeling, analysis, and optimization within the media, advertising, or tech industry.
- At least 3 years of hands-on experience with Generative AI, Large Language Models, and Natural Language Processing techniques.
- Minimum 3 years of experience in publisher and advertiser audience data analytics and modeling.
- Proficient in data collection, business intelligence, machine learning, and deep learning techniques using tools such as Python, R, scikit-learn, Hadoop, Spark, MySQL, and AWS S3.
- Expertise in logistic regression, customer segmentation, persona building, and predictive analytics.
- Strong analytical and data modeling skills with a deep understanding of audience behavior, pricing strategies, and programmatic media optimization.
- Experience working with DSPs, SSPs, DMPs, and programmatic systems.
- Excellent communication and presentation skills, with the ability to convey complex technical concepts to non-technical stakeholders.
- Ability to manage multiple tasks and projects effectively, both independently and in collaboration with remote teams.
- Strong problem-solving skills with the ability to adapt to evolving business needs and deliver solutions proactively.
- Experience developing analytics dashboards, visualization tools, and reporting systems.
- Background in digital media optimization, audience segmentation, and performance analytics.
- An interest in and ability to work in a fast-paced operation on the analytics and revenue side of the business.
- Willingness to relocate to Mysuru/Bengaluru.

This is an exciting opportunity to take a leadership role at the forefront of data science in the digital media and advertising space. If you have a passion for innovation, a strong technical background, and the ability to lead a team toward impactful, data-driven solutions, we encourage you to apply.
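As a toy sketch of the propensity-modelling work referenced above, the following fits a logistic regression over synthetic RFM-style features with scikit-learn; real audience features and conversion labels would replace the generated data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for recency/frequency/monetary/context features.
X = rng.normal(size=(5000, 4))
true_weights = np.array([1.2, -0.8, 0.5, 0.3])
y = (X @ true_weights + rng.normal(size=5000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Fit the propensity model and score holdout conversion probabilities.
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
propensity = model.predict_proba(X_te)[:, 1]
print("holdout AUC:", round(roc_auc_score(y_te, propensity), 3))
```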
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Senior Cloud Data Developer, you will bridge the gap between data engineering and cloud-native application development. Your primary responsibility will be to design and maintain scalable data solutions in the cloud, leveraging cutting-edge technologies and modern development practices.

You will develop data-centric applications using Java Spring Boot and various AWS services, creating and managing efficient data pipelines, ETL processes, and data workflows, and orchestrating data processing systems. Real-time data processing solutions using AWS SNS/SQS and AWS Pipes will be a key part of your role. Additionally, you will optimize data storage solutions using AWS Aurora and S3, manage data discovery and metadata using the AWS Glue Data Catalog, create search and analytics solutions using AWS OpenSearch Service, and design event-driven architectures for data processing.

To excel in this role, you must have strong proficiency in Java and the Spring Boot framework, along with extensive experience across AWS data services such as AWS EMR, AWS Glue Data Catalog, AWS OpenSearch Service, AWS Aurora, and AWS S3. Expertise in data pipeline development using tools like Apache NiFi, AWS MWAA, AWS Pipes, and AWS SNS/SQS will be highly valuable in fulfilling the technical requirements of this position.
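The posting's stack is Java/Spring Boot, but the SQS long-polling pattern it refers to is language-agnostic; a Python/boto3 sketch of the same consume loop is shown below, with the queue URL and handler as placeholders.

```python
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/events"  # placeholder


def process(body: str) -> None:
    # Application-specific handling would go here.
    print("got message:", body)


while True:
    # Long-poll up to 20 seconds to avoid busy-waiting on an empty queue.
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
    )
    for msg in resp.get("Messages", []):
        process(msg["Body"])
        # Delete only after successful processing (at-least-once delivery).
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```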
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a professional services firm affiliated with KPMG International Limited, KPMG entities in India have been serving national and international clients since August 1993. With offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada, our professionals leverage a global network of firms while maintaining expertise in local laws, regulations, markets, and competition. We are committed to delivering rapid, performance-based, industry-focused, and technology-enabled services that reflect our extensive knowledge of global and local industries and our deep understanding of the Indian business environment.

For this role, we are looking for individuals with expertise in the following technologies and tools: Python, SQL, AWS Lambda, AWS Glue, AWS RDS, AWS S3, AWS Athena, AWS Redshift, AWS EventBridge, PySpark, Snowflake, Git, Azure DevOps, JIRA, cloud computing, Agile methodologies, automation, and Talend.

If you are passionate about working in a dynamic environment that values equal employment opportunities and embraces diverse perspectives, we invite you to join our team at KPMG in India.
Posted 2 weeks ago
5.0 - 10.0 years
40 - 45 Lacs
pune, gurugram, bengaluru
Work from Office
Notice: Immediate joiners only.

Responsibilities:
- Design, develop, and maintain SQL Server Analysis Services (SSAS) models.
- Create and manage OLAP cubes to support business intelligence reporting.
- Develop and implement multidimensional and tabular data models.
- Optimize the performance of SSAS solutions for efficient query processing.
- Integrate data from various sources into SQL Server databases and SSAS models.
- Preferred: knowledge of AWS S3 and SQL Server PolyBase.

Location: Bangalore, Pune, Gurgaon, Noida, Hyderabad.
Posted 2 weeks ago
7.0 - 9.0 years
27 - 30 Lacs
bengaluru
Work from Office
We are seeking experienced Data Engineers with over 7 years of experience to join our team at Intuit. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore.

Key Responsibilities:
- Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark.
- Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data.
- Develop and optimize complex SQL queries for data extraction and reporting (see the sketch after this listing).
- Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics.
- Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs.
- Monitor data pipelines and troubleshoot any issues related to data integrity or system performance.

Required Skills:
- 7+ years of experience in data engineering or related fields.
- In-depth knowledge of Data Warehouses and Data Lakes.
- Proven experience building data pipelines using PySpark.
- Strong expertise in SQL for data manipulation and extraction.
- Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, and Redshift, and other cloud computing platforms.

Preferred Skills:
- Python programming experience is a plus.
- Experience working in Agile environments with tools like JIRA and GitHub.
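The SQL-extraction bullet above, in PySpark terms, usually means registering a lake table as a view and querying it with Spark SQL; a minimal sketch follows, with paths and column names as assumptions.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("weekly-report").getOrCreate()

# Load a lake table and expose it to Spark SQL (path is a placeholder).
events = spark.read.parquet("s3://example-lake/silver/events/")
events.createOrReplaceTempView("events")

# Extract the most active users over the trailing week.
top_users = spark.sql(
    """
    SELECT user_id, COUNT(*) AS event_count
    FROM events
    WHERE event_date >= date_sub(current_date(), 7)
    GROUP BY user_id
    ORDER BY event_count DESC
    LIMIT 100
    """
)
top_users.write.mode("overwrite").parquet("s3://example-lake/gold/top_users/")
```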
Posted 2 weeks ago
7.0 - 9.0 years
25 - 32 Lacs
chennai, bengaluru
Work from Office
Hiring Cloud Engineers for an 8-month contract role based in Chennai or Bangalore with hybrid/remote flexibility. The ideal candidate will have 8+ years of IT experience, including 4+ years in AWS cloud migrations, with strong hands-on expertise in AWS MGN, EC2, EKS, Terraform, and scripting in Python or Shell. Responsibilities include leading lift-and-shift migrations, automating infrastructure, migrating storage to EBS, S3, and EFS, and modernizing legacy applications. AWS/Terraform certifications and experience with monolithic and microservices architectures are preferred.
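One concrete flavour of the Python scripting such migrations involve is building an inventory of instances and their attached volumes before a lift-and-shift; a boto3 sketch (the region is a placeholder) follows.

```python
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")

# Page through all reservations so large estates are fully inventoried.
paginator = ec2.get_paginator("describe_instances")
for page in paginator.paginate():
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            volumes = [
                mapping["Ebs"]["VolumeId"]
                for mapping in instance.get("BlockDeviceMappings", [])
                if "Ebs" in mapping
            ]
            print(instance["InstanceId"], instance["InstanceType"], volumes)
```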
Posted 2 weeks ago
9.0 - 14.0 years
15 - 20 Lacs
hyderabad, pune, bengaluru
Work from Office
Notice period: Immediate to 15 days.

Requirements:
- Languages: Java and Golang (mandatory).
- Technologies: Deep expertise in Flyte OSS and its extensibility.
- Experience with cloud-native development, particularly AWS S3, KMS, and potentially VAST S3.
- Proficiency in containerization (Docker) and Kubernetes.
- Experience with ML infrastructure components (Model Registry, Feature Platforms, GPU scheduling).
- Knowledge of security best practices for data access (pre-signed URLs, Vault); see the sketch after this listing.
- Experience with UI/backend integration for orchestration platforms.
- Methodology: Proven ability to diagnose and resolve complex technical debt and architectural gaps, strong problem-solving skills for integrating disparate systems, and experience delivering production-ready features in a fast-paced environment.
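The posting asks for Java and Golang, but the pre-signed-URL pattern it mentions is quickest to show with boto3; the bucket, key, and expiry below are assumptions.

```python
import boto3
from botocore.config import Config

# SigV4 signing is required for pre-signed URLs in most modern regions.
s3 = boto3.client("s3", config=Config(signature_version="s3v4"))

# Grant time-limited read access to one object without sharing credentials.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "model-artifacts", "Key": "runs/123/model.pt"},  # placeholders
    ExpiresIn=900,  # 15 minutes
)
print(url)
```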
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Java Developer with 3 to 6 years of experience, your primary responsibility will be to translate application storyboards and use cases into high-quality, efficient code. You will design, develop, and maintain clean, reusable Java code while taking full ownership of modules to ensure timely deployment to production. It will be essential for you to optimize application performance, quality, and responsiveness, as well as identify and resolve bottlenecks, bugs, and technical issues. Maintaining high standards of code quality, organization, and automation will also be a key part of your role. You will write well-structured, testable, and efficient code, investigate new technologies and approaches and present them for architectural review, participate in code reviews with constructive feedback for peers, and stay current with the latest technologies and trends.

You should possess strong proficiency in Core Java, including OOP, Collections, Threads, Regular Expressions, and Exception Handling. A solid understanding of object-oriented programming and the ability to write clean, readable Java code are crucial, as is knowledge of scalable application design principles. Strong experience with relational databases, SQL, and ORM technologies such as MySQL and Hibernate is required, along with proficiency in software design and development using Java, J2EE, Spring Boot, Spring Security, and Hibernate. Experience with test-driven development and familiarity with CI/CD processes for build and deployment will be advantageous.

Key requirements for this role include experience with Java 8/Java 11, expertise in the Spring Framework, and hands-on experience with relational databases, coupled with proficiency in building scalable REST APIs capable of handling 20k+ simultaneous users, a willingness to work with new technologies, and strong communication skills. Desirable skills include experience with React JS and JavaScript frameworks, knowledge of microservices architecture, and familiarity with Redis, AWS S3, AWS Lambda, and NoSQL databases. CI/CD experience is a plus. Brilworks is committed to supporting your growth in these areas where necessary!
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Technical Lead Full Stack with 8 to 11 years of experience, you will guide a small engineering team and lead backend development using Go, Rust, TypeScript, and Python (Django), while also being proficient with front-end technologies like React JS. Your responsibilities will include designing and building microservices and RESTful APIs, maintaining the front-end in React JS, and working with AWS services like S3 and RDS. Additionally, you will ensure system security using tools like HashiCorp Vault and FusionAuth, manage data flows with tools like Kafka, Redis, and GraphQL, monitor performance with Datadog and Grafana, and support CI/CD and automation efforts.

You will review code, provide guidance to team members, keep documentation up to date, and collaborate closely with front-end, DevOps, and product teams. Your role will also involve planning and building high-quality software, making technical decisions, promoting clean and maintainable code practices, and aligning the team with both technical and business goals, being hands-on when necessary and encouraging best practices.

The primary skills required for this role include proficiency in Go, Rust, Python (Django), TypeScript, and JavaScript, along with expertise in React JS and HTML/CSS for frontend development. Knowledge of cloud services like AWS (S3, RDS), GitHub, and Nginx, and experience building microservices, REST APIs, and Redis-backed architectures are also important. Familiarity with monitoring tools like Datadog and Grafana, plus HashiCorp Vault, FusionAuth, Kafka, GraphQL, and CI/CD workflows is a plus, as is experience with browser testing, UI testing tools like Cypress, and building UI components from scratch.

If you are passionate about leading technical discussions, designing and building backend systems and services, and contributing to the development of reliable and scalable software, this role offers an exciting opportunity to showcase your skills and expertise in a dynamic and collaborative environment.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
The Senior Data Engineer designs, builds, and maintains ETL/ELT pipelines, integrating data from various sources to generate comprehensive insights. You will manage data warehouse solutions, design data models, create ETL processes, and ensure data quality, performing exploratory data analysis to troubleshoot issues and collaborating with clients. Mentoring junior team members and offering guidance is also expected.

In terms of technical skills, proficiency in Python, PySpark, SQL, data warehousing, ETL, data modelling, and building ETL pipelines is crucial. Hands-on experience with tools like Databricks, Redshift, and ADF, and with cloud services such as Azure/AWS S3, Glue, Lambda, CloudWatch, and Athena, is required. Knowledge of data operations, data quality, data governance, SFDC, and Waterfall/Agile methodology is essential, and familiarity with the pharma domain and life sciences commercial data operations is preferred.

To qualify for this position, you should hold a Bachelor's or Master's degree in Engineering, MCA, or an equivalent field, with a minimum of 4-6 years of experience as a Data Engineer, particularly working with pharma syndicated data such as IQVIA, Veeva, Symphony, claims, CRM, and sales data. You should possess strong communication, analytical, and problem-solving skills, and demonstrate high motivation, good work ethic, maturity, self-organization, and personal initiative. Collaborative teamwork and support to the team are crucial aspects of this role. This position is based in Hyderabad, India.
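To make the data-quality responsibility concrete, here is a hedged PySpark sketch that fails a load on null keys or duplicate grain; the path, key columns, and zero-tolerance thresholds are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Load the dataset under test (path and grain columns are placeholders).
df = spark.read.parquet("s3://example-lake/sales/")

checks = {
    "null_account_id": df.filter(F.col("account_id").isNull()).count(),
    "duplicate_grain": df.count()
    - df.dropDuplicates(["account_id", "week"]).count(),
}

# Fail fast so bad loads never propagate downstream.
failed = {name: count for name, count in checks.items() if count > 0}
if failed:
    raise ValueError(f"data quality checks failed: {failed}")
print("all data quality checks passed")
```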
Posted 2 weeks ago
6.0 - 10.0 years
10 - 17 Lacs
bengaluru
Work from Office
Strong application development work experience, ideally in an Agile environment, with solid application design, coding, testing, maintenance, and debugging skills. Experience with JUnit and Cucumber testing, APM monitoring tools, and logging tools like Splunk; proficiency with JIRA and Confluence is preferred. Expertise in development using Core Java, J2EE, XML, Web Services/SOA, and Java frameworks including Spring, Spring Batch, Spring Boot, JPA, REST, and MQ. Knowledgeable in developing RESTful microservices, with hands-on experience in AWS. Experience working with Git/Bitbucket, Maven, Gradle, and Jenkins to build and deploy code to production environments, and hands-on experience with CI/CD and Kubernetes.
Posted 3 weeks ago