0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Key Responsibilities:
• Develop and implement machine learning models and algorithms.
• Work closely with project stakeholders to understand requirements and translate them into deliverables.
• Utilize statistical and machine learning techniques to analyze and interpret complex data sets.
• Stay updated with the latest advancements in AI/ML technologies and methodologies.
• Collaborate with cross-functional teams to support various AI/ML initiatives.
Qualifications:
• Bachelor’s degree in Computer Science, Data Science, Statistics, Mathematics, or a related field.
• Strong understanding of machine learning, deep learning, and Generative AI concepts.
Preferred Skills:
• Experience in machine learning techniques such as regression, classification, predictive modeling, clustering, the deep learning stack, and NLP using Python.
• Strong knowledge and experience in Generative AI/LLM-based development.
• Strong experience working with key LLM APIs (e.g., AWS Bedrock, Azure OpenAI/OpenAI) and LLM frameworks (e.g., LangChain, LlamaIndex).
• Experience with cloud infrastructure for AI/Generative AI/ML on AWS and Azure.
• Expertise in building enterprise-grade, secure data ingestion pipelines for unstructured data, including indexing, search, and advanced retrieval patterns.
• Knowledge of effective text chunking techniques for optimal processing and indexing of large documents or datasets.
• Proficiency in generating and working with text embeddings, with an understanding of embedding spaces and their applications in semantic search and information retrieval.
• Experience with RAG concepts and fundamentals (vector DBs, AWS OpenSearch, semantic search, etc.), and expertise in implementing RAG systems that combine knowledge bases with Generative AI models.
• Knowledge of training and fine-tuning foundation models (Anthropic Claude, Mistral, etc.), including multimodal inputs and outputs.
• Proficiency in Python, TypeScript, NodeJS, ReactJS (and equivalents) and frameworks (e.g., pandas, NumPy, scikit-learn), plus Glue crawlers and ETL.
• Experience with data visualization tools (e.g., Matplotlib, Seaborn, QuickSight).
• Knowledge of deep learning frameworks (e.g., TensorFlow, Keras, PyTorch).
• Experience with version control systems (e.g., Git, CodeCommit).
Good to Have Skills:
• Knowledge and experience in building knowledge graphs in production.
• Understanding of multi-agent systems and their applications in complex problem-solving scenarios.
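Several of the preferred skills above (text chunking, embeddings, semantic search) fit together in one short retrieval flow. The sketch below is a minimal illustration under assumed inputs; the model name, chunk sizes, source file, and query are not from the posting.

```python
# Minimal chunk -> embed -> search flow. Model name, chunk sizes, file name,
# and query are illustrative assumptions, not requirements from the posting.
import numpy as np
from sentence_transformers import SentenceTransformer

def chunk_text(text, chunk_size=200, overlap=40):
    """Split a document into overlapping word windows for indexing."""
    words = text.split()
    step = chunk_size - overlap
    return [" ".join(words[i:i + chunk_size])
            for i in range(0, max(len(words) - overlap, 1), step)]

model = SentenceTransformer("all-MiniLM-L6-v2")  # any sentence-embedding model works

chunks = chunk_text(open("large_document.txt").read())
chunk_vecs = model.encode(chunks, normalize_embeddings=True)       # (n_chunks, dim)
query_vec = model.encode(["How do I rotate an access key?"],
                         normalize_embeddings=True)

scores = chunk_vecs @ query_vec.T           # cosine similarity (vectors are normalized)
for i in np.argsort(-scores.ravel())[:3]:   # top-3 chunks would feed the LLM prompt
    print(f"{scores[i, 0]:.3f}  {chunks[i][:80]}...")
```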
Posted 5 days ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role Overview: We are looking for a MERN Stack Developer with 4+ years of hands-on experience to join our development team. You will be responsible for building and maintaining full-stack web applications using MongoDB, Express.js, React.js, and Node.js.
Key Responsibilities:
• Design, develop, and maintain scalable and high-performance web applications using the MERN stack
• Write clean, maintainable, and efficient code following best practices
• Build RESTful APIs with Node.js and Express.js
• Design and manage NoSQL schemas in MongoDB
• Develop responsive front-end interfaces using React.js (with Redux or Context API)
• Optimize components for maximum performance across a range of devices and browsers
• Collaborate with UI/UX designers, QA engineers, and backend teams
• Perform code reviews and mentor junior developers if needed
• Integrate third-party APIs and services when required
• Troubleshoot and debug issues as they arise in development and production environments
Must-Have Skills:
• Strong proficiency in JavaScript
• Hands-on experience with MongoDB, Express.js, React.js, and Node.js
• Understanding of component-based architecture and state management using Redux/Context API
• Experience with RESTful API design and integration
• Familiarity with MongoDB design patterns, aggregation, indexing, etc.
• Experience with Git and version control best practices
• Knowledge of HTML5, CSS3, and modern front-end build pipelines (Webpack, Babel)
• Experience with Postman, Swagger, or similar tools for API testing
• Familiarity with asynchronous programming and event-driven architecture
Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or related field
• 4+ years of professional experience in full-stack development using the MERN stack
Posted 5 days ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Senior AI Engineer
Location: Bangalore, Gurugram
Experience: 4-9 Years
Role Overview: We are seeking highly skilled AI Engineers to lead the design and delivery of cutting-edge LLM-powered, agentic AI solutions. The role requires deep expertise in RAG pipelines, prompt engineering, and scalable deployment of AI systems in production environments.
Key Responsibilities:
• Design, develop, and optimize Retrieval-Augmented Generation (RAG) systems, including data ingestion pipelines, vector embedding & indexing, and hybrid retrieval with grounded generation.
• Work hands-on with LLM APIs such as OpenAI, FILxGPT, and other foundation models.
• Architect intelligent, agentic workflows that combine retrieval, generation, and tool execution.
• Engineer and iterate on high-impact prompts and prompt chains for complex tasks.
• Drive model governance, including versioning, monitoring, and bias/security auditing.
• Collaborate across teams to scale AI services in production-grade environments.
• Mentor junior developers and support knowledge sharing within the AI/ML team.
Required Skills & Qualifications:
• 4–9 years of experience in AI/ML engineering or data science
• 2+ years of hands-on work with LLMs, RAG architectures, and agentic systems
• Strong understanding and implementation experience in prompt engineering, vector databases (e.g., FAISS, Pinecone, Weaviate), and RAG architectures
• Proficiency in Python, cloud platforms (AWS preferred), containerization (Docker, Kubernetes), and CI/CD tools for model and API deployments
• Excellent communication and stakeholder management skills
• Strong ability to guide and mentor junior team members
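To make the vector-database piece of such a RAG pipeline concrete, here is a minimal sketch using FAISS, one of the libraries the posting names. The random vectors stand in for real embeddings and the dimension is an assumption.

```python
# Vector-index sketch with FAISS; random vectors stand in for real embeddings.
import faiss
import numpy as np

dim = 384                                          # typical embedding width (assumption)
rng = np.random.default_rng(0)
chunk_vectors = rng.standard_normal((1000, dim)).astype("float32")
faiss.normalize_L2(chunk_vectors)                  # cosine similarity via inner product

index = faiss.IndexFlatIP(dim)                     # exact inner-product search
index.add(chunk_vectors)

query = rng.standard_normal((1, dim)).astype("float32")
faiss.normalize_L2(query)
scores, ids = index.search(query, 5)               # top-5 nearest chunks
print(ids[0], scores[0])                           # ids select chunks for the prompt
```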
Posted 5 days ago
0.0 - 2.0 years
4 - 7 Lacs
Gurugram, Haryana
Remote
We are seeking a skilled Backend Developer with 2+ years of experience and strong expertise in Node.js and MS SQL Server to join our growing development team. The ideal candidate will be responsible for building and maintaining scalable, high-performance backend services and APIs to support our web and mobile applications.
Key Responsibilities:
· Design, develop, and maintain backend services and RESTful APIs using Node.js
· Write efficient SQL queries, stored procedures, and optimize database performance on MS SQL Server
· Integrate third-party APIs and data sources
· Collaborate with frontend developers, product managers, and QA teams to deliver high-quality software
· Ensure application performance, reliability, and security
· Debug and troubleshoot issues across the stack
· Participate in code reviews and maintain code quality standards
· Document technical solutions and contribute to system architecture discussions
Required Skills & Qualifications:
· Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent practical experience)
· Strong proficiency in Node.js and its frameworks (e.g., Express.js)
· In-depth experience with MS SQL Server including schema design, indexing, and query optimization
· Solid understanding of RESTful API design principles
· Familiarity with asynchronous programming and event-driven architecture
· Experience with version control systems like Git
· Understanding of software development best practices, including Agile methodologies
· Ability to write clean, maintainable, and testable code
· Strong problem-solving and communication skills
Job Types: Full-time, Permanent
Pay: ₹450,000.00 - ₹700,000.00 per year
Benefits: Paid sick time, Paid time off, Provident Fund, Work from home
Education: Bachelor's (Required)
Experience: Node.js: 2 years (Required)
Location: Gurgaon, Haryana (Required)
Work Location: In person
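The stored-procedure pattern this posting names is easy to sketch. Python with pyodbc is used below only to keep the examples on this page in one language (the role itself centers on Node.js); the connection string, procedure name, and result columns are assumptions.

```python
# Calling a parameterized stored procedure on MS SQL Server via pyodbc.
# Connection string, procedure name, and result columns are assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=appdb;UID=app;PWD=***"
)
cursor = conn.cursor()

# Parameterized call: keeps the query plan cached and avoids SQL injection.
cursor.execute("{CALL usp_GetOrdersByCustomer (?)}", ("CUST-1001",))
for row in cursor.fetchall():
    print(row)        # each row is a pyodbc Row holding the procedure's columns
conn.close()
```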
Posted 5 days ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
We are seeking a highly experienced AWS Data Solution Architect to lead the design and implementation of scalable, secure, and high-performance data architectures on the AWS cloud. The ideal candidate will have a deep understanding of cloud-based data platforms, analytics, and best practices for optimizing data pipelines and storage. You will work closely with data engineers, business stakeholders, and cloud architects to deliver robust data solutions.
Key Responsibilities:
1. Architecture Design and Planning: Design scalable and resilient data architectures on AWS that include data lakes, data warehouses, and real-time processing. Architect end-to-end data solutions leveraging AWS services such as S3, Redshift, RDS, DynamoDB, Glue, and Lake Formation. Develop multi-layered security frameworks for data protection and governance.
2. Data Pipeline Development: Build and optimize ETL/ELT pipelines using AWS Glue, Data Pipeline, and Lambda. Integrate data from various sources like RDBMS, NoSQL, APIs, and streaming platforms. Ensure high availability and real-time processing capabilities for mission-critical applications.
3. Data Warehousing and Analytics: Design and optimize data warehouses using Amazon Redshift or Snowflake. Implement data modeling, partitioning, and indexing for optimal performance. Create analytical models to drive business insights and data-driven decision-making.
4. Real-time Data Processing: Implement real-time data processing using AWS Kinesis, Kafka, or MSK. Architect solutions for event-driven architectures with Lambda and EventBridge.
5. Security and Compliance: Implement best practices for data security, encryption, and access control using IAM, KMS, and Lake Formation. Ensure compliance with regulatory standards like GDPR, HIPAA, and CCPA.
6. Monitoring and Optimization: Monitor performance, optimize costs, and enhance the reliability of data pipelines and storage. Set up observability with AWS CloudWatch, X-Ray, and CloudTrail. Troubleshoot issues and ensure business continuity with automated recovery mechanisms.
7. Documentation and Best Practices: Create detailed architecture diagrams, data flow mappings, and documentation for reference. Establish best practices for data governance, architecture design, and deployment.
8. Collaboration and Leadership: Work closely with data engineers, application developers, and DevOps teams to ensure seamless integration. Act as a technical advisor to business stakeholders for cloud-based data solutions.
Regulatory Compliance Reporting Experience: The architect should be able to resolve complex challenges arising from the strict regulatory environment in India and the need to balance compliance with operational efficiency. Key complexities include:
a) Building data segregation and access control capability: This requires an in-depth understanding of data privacy laws, Amazon’s global data architecture, and the ability to design systems that can segregate and control access to sensitive payment data without compromising functionality.
b) Integrating diverse data sources into Secure Redshift Cluster (SRC) data, which involves working with multiple teams and systems, each with its own data structure and transfer protocols.
c) Instrumenting additional UPI data elements, which requires collaborating with UPI tech teams and a deep understanding of UPI transaction flows to ensure accurate and compliant data capture.
d) Automating Law Enforcement Agency (LEA) and Financial Intelligence Unit (FIU) reporting: This involves creating secure, automated pipelines for highly sensitive data, ensuring accuracy and timeliness while meeting strict regulatory requirements.
The Architect will be extending from India-specific solutions to serving worldwide markets. Complexities include:
a) Designing a unified data storage and compute architecture, which requires harmonizing diverse tech stacks and data logging practices across multiple countries while considering data sovereignty laws and the cost implications of cross-border data transfers.
b) Setting up comprehensive datamarts covering metrics and dimensions, which involves standardizing metric definitions across markets, ensuring data consistency, and designing for scalability to accommodate future growth.
c) Enabling customer segmentation across power-up programs, which requires integrating data from diverse programs while maintaining data integrity and respecting country-specific data usage regulations.
d) Managing time zone challenges: Synchronizing data across multiple time zones requires innovative solutions to ensure timely data availability without compromising completeness or accuracy.
e) Navigating regulatory complexities: Designing systems that comply with varying and evolving data regulations across multiple countries while maintaining operational efficiency and flexibility for future changes.
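As a concrete illustration of the streaming-ingestion patterns above, here is a hedged sketch of a Lambda function consuming a Kinesis stream and landing records in an S3 data lake. The bucket, key prefix, and record schema are assumptions; production code would add batching, retries, and dead-letter handling.

```python
# Lambda handler: consume a Kinesis batch and land it in S3 as JSON lines.
# Bucket and prefix are placeholders; batching/retries/DLQ are omitted.
import base64
import json
import uuid

import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-lake"     # placeholder bucket name
PREFIX = "raw/events/"           # real pipelines usually partition by source/date

def handler(event, context):
    """Triggered by a Kinesis event source mapping."""
    records = [
        json.loads(base64.b64decode(r["kinesis"]["data"]))
        for r in event["Records"]
    ]
    body = "\n".join(json.dumps(r) for r in records)
    s3.put_object(Bucket=BUCKET, Key=f"{PREFIX}{uuid.uuid4()}.jsonl",
                  Body=body.encode("utf-8"))
    return {"written": len(records)}
```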
Posted 5 days ago
0.0 - 3.0 years
0 - 0 Lacs
Eramalloor, Kerala
On-site
Location: Kochi, Kerala (On-Site) | Full-Time | Experience: 1–3 Years
About Us: We are a fast-growing women's fashion brand with a strong online presence. We're looking for a smart, skilled, and proactive Digital Marketing Executive to manage and grow our Shopify-based website, drive SEO, and run paid ads (Meta & Google).
Key Responsibilities:
Shopify Website Maintenance: Update product listings, banners, and collection pages; manage backend operations like discounts, inventory syncs, and basic HTML edits; ensure site speed, mobile responsiveness, and error-free UX.
SEO Management: Keyword research and on-page SEO (meta tags, product descriptions, URLs); technical SEO audits and improvements (page speed, indexing, etc.); monthly performance tracking via Google Search Console & Analytics.
Ads & Campaigns: Run and optimize Meta (Facebook & Instagram) ad campaigns; set up Google Search, Display, and Shopping ads; A/B testing, pixel & conversion tracking, and budget optimization.
Reporting: Weekly performance reports across SEO and Ads; ROAS tracking for campaigns; suggestions for growth based on analytics.
Who We’re Looking For: 1–3 years of experience in digital marketing (e-commerce preferred); hands-on with the Shopify admin panel; proficiency in Google Ads, Meta Ads Manager, Google Analytics, and GSC; knowledge of SEO tools (Ahrefs, SEMrush, etc.) is a bonus; a basic understanding of design tools like Canva or Photoshop is a plus. Should be proactive, responsible, and performance-driven.
How to Apply: Send your updated CV, portfolio/campaign reports (if available), and expected salary to: career@swapnawedding@gmail.com | WhatsApp: 7736898055
Office Location: Eramaloor, Kochi – Kerala (Prefer candidates nearby or willing to relocate)
Job Types: Full-time, Part-time, Permanent
Pay: ₹15,000.00 - ₹30,000.00 per month
Work Location: In person
Posted 5 days ago
0.0 - 3.0 years
0 Lacs
Gurugram, Haryana
On-site
Job ID: 1924 | Location: Fully On-Site, Gurgaon, Haryana, IN | Job Family: Financial Services | Job Type: Permanent | Employment Type: Full Time
About Us: Innovation. Sustainability. Productivity. This is how we are Breaking New Ground in our mission to sustainably advance the noble work of farmers and builders everywhere. With a growing global population and increased demands on resources, our products are instrumental to feeding and sheltering the world. From developing products that run on alternative power to productivity-enhancing precision tech, we are delivering solutions that benefit people – and they are possible thanks to people like you. If the opportunity to build your skills as part of a collaborative, global team excites you, you’re in the right place. Grow a Career. Build a Future! Be part of this company at the forefront of agriculture and construction that passionately innovates to drive customer efficiency and success. And we know innovation can’t happen without collaboration. So, everything we do at CNH Industrial is about reaching new heights as one team, always delivering for the good of our customers.
Job Purpose: Timely processing of contract servicing activities for retail operations, ensuring that all processing complies with internal policies and conditions.
Key Responsibilities:
Process UCC terminations at end of contract.
Index loan documents in the document management system.
Respond to different types of customer queries.
Process vendor invoices.
Verify retail loan contract agreements.
Perform customer demographic changes.
Make changes to payment schedules.
Ensure all fields in the systems are correctly entered.
Liaise with stakeholders where errors and omissions are noted.
Experience Required: Minimum of 2-3 years of experience in operations in banks or NBFCs. Experience of working in activities like contract servicing, invoice processing, document verification, and data entry. Good communication skills in English and the ability to clearly communicate with all peers and management. Attention to detail. Ability to work under pressure. Ability to work independently and proactively.
Preferred Qualifications: Bachelor’s degree. Language Requirement(s): English.
What We Offer: We offer dynamic career opportunities across an international landscape. As an equal opportunity employer, we are committed to delivering value for all our employees and fostering a culture of respect. At CNH, we understand that the best solutions come from the diverse experiences and skills of our people. Here, you will be empowered to grow your career, to follow your passion, and help build a better future. To support our employees, we offer regional comprehensive benefits, including: Flexible work arrangements; Savings & Retirement benefits; Tuition reimbursement; Parental leave; Adoption assistance; Fertility & Family building support; Employee Assistance Programs; Charitable contribution matching and Volunteer Time Off.
Posted 5 days ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
ABOUT TRIBUTE TECHNOLOGY: At Tribute Technology, we make end-of-life celebrations memorable, meaningful, and effortless through thoughtful and innovative technology solutions. Our mission is to help communities around the world celebrate life and pay tribute to those we love. Our comprehensive platform brings together software and technology to provide a fully integrated experience for all users, whether that is a family, a funeral home, or an online publisher. We are the market leader in the US and Canada, with global expansion plans and a growing international team of more than 400 individuals in the US, Canada, Philippines, Ukraine, and India.
ABOUT YOU: Join our dynamic and innovative team as a Senior Backend Developer (Ruby) and embark on an exciting journey of transforming a billion-dollar startup into a cutting-edge microservices, domain-driven design powerhouse. As a key member of our engineering team, you will play a crucial role in building the future of our company.
ESSENTIAL DUTIES AND RESPONSIBILITIES: Lead the design, development, and maintenance of highly scalable and robust backend services and applications using Ruby. Architect and implement backend solutions, making strategic architectural decisions that align with business goals and scalability requirements. Mentor and guide junior and mid-level developers, providing technical leadership, code reviews, and best practices. Drive the adoption of design patterns, SOLID principles, and RESTful APIs to ensure a maintainable, extensible, and scalable codebase. Collaborate with cross-functional teams to gather requirements, design solutions, and deliver high-quality software products. Design and optimize database schemas and queries for performance, scalability, and reliability, utilizing Active Record for database interactions. Champion a culture of test-driven development (TDD) and write comprehensive unit tests using frameworks like RSpec to ensure code coverage and maintainability. Make architectural decisions regarding technology stack, frameworks, and third-party integrations, considering factors such as performance, security, and maintainability. Lead technical discussions and provide insights into emerging technologies, industry trends, and best practices. Troubleshoot complex backend issues, perform root cause analysis, and provide timely resolutions. Ensure adherence to coding standards, best practices, and security guidelines. Collaborate with DevOps and infrastructure teams to ensure smooth deployment and operation of backend services. Participate in Agile development processes, including sprint planning, backlog grooming, and retrospectives.
REQUIRED SKILLS: Bachelor's degree in Computer Science, Software Engineering, or a related field. 5-8 years of experience in backend development, specifically using Ruby. Expertise in object-oriented programming concepts, design patterns, and SOLID principles. Strong understanding of relational databases, particularly PostgreSQL, and experience in optimizing database performance. Experience with ORM frameworks like Active Record. Experience designing and consuming RESTful APIs. Experience with testing tools such as RSpec, Capybara, MiniTest, and understanding of Test-Driven Development (TDD). Familiarity with background job processing using tools like Sidekiq, Resque, or Delayed::Job. Understanding of caching mechanisms (e.g., Memcached, Redis) to optimize performance. Experience with deploying applications using Heroku or AWS. Familiarity with continuous integration/continuous deployment (CI/CD) pipelines. Ensuring code quality, reliability, and maintainability through careful design and testing. Ability to optimize database queries, and experience with database migrations and indexing. Deep understanding of web security practices (e.g., protecting against XSS, CSRF, SQL injection), and experience with OAuth, JWT, and SSO. Familiarity with profiling and monitoring tools like New Relic, Skylight, or Scout to identify and improve application performance. Solid understanding of front-end technologies to bridge the gap between back-end and user interface. Excellent problem-solving skills and attention to detail. Strong leadership and mentoring skills. Excellent communication and collaboration skills. Ability to make strategic technical decisions and drive architectural discussions.
PREFERRED SKILLS: Experience with microservices architecture and distributed systems. Knowledge of cloud platforms such as AWS or Azure. Experience with front-end technologies like JavaScript, jQuery, or React to manage full-stack development. Ability to collaborate with front-end developers and ensure seamless API integrations. Hands-on experience with AWS, Google Cloud, or Heroku for deploying and scaling applications. Experience with CI/CD pipelines, containerization tools like Docker, and orchestration tools like Kubernetes. Experience in Agile methodologies such as Scrum or Kanban, including sprint planning, backlog grooming, and task assignment. Experience with tools like GitHub Actions, Jenkins, GitLab CI, or CircleCI for automating deployments. Understanding of software security best practices. Active participation in the developer community through conferences, meetups, or open-source contributions.
BENEFITS: Competitive salary. Fully remote across India. An outstanding collaborative work environment. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of the position.
Posted 5 days ago
4.0 - 12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description: Data Engineer
We at Pine Labs are looking for those who share our core belief - Every Day is Game Day. We bring our best selves to work each day to realize our mission of enriching the world through the power of digital commerce and financial services.
Role Purpose: We are looking for skilled Data Engineers with 4-12 years of experience to join our growing team. You will design, build, and optimize real-time and batch data pipelines, leveraging AWS cloud technologies and Apache Pinot to enable high-performance analytics for our business. This role is ideal for engineers who are passionate about working with large-scale data and real-time processing.
Responsibilities We Entrust You With:
Data Pipeline Development: Build and maintain robust ETL/ELT pipelines for batch and streaming data using tools like Apache Spark, Apache Flink, or AWS Glue. Develop real-time ingestion pipelines into Apache Pinot using streaming platforms like Kafka or Kinesis.
Real-Time Analytics: Configure and optimize Apache Pinot clusters for sub-second query performance and high availability. Design indexing strategies and schema structures to support real-time and historical data use cases.
Cloud Infrastructure Management: Work extensively with AWS services such as S3, Redshift, Kinesis, Lambda, DynamoDB, and CloudFormation to create scalable, cost-effective solutions. Implement infrastructure as code (IaC) using tools like Terraform or AWS CDK.
Performance Optimization: Optimize data pipelines and queries to handle high throughput and large-scale data efficiently. Monitor and tune Apache Pinot and AWS components to achieve peak performance.
Data Governance & Security: Ensure data integrity, security, and compliance with organizational and regulatory standards (e.g., GDPR, SOC2). Implement data lineage, access controls, and auditing mechanisms.
Collaboration: Work closely with data scientists, analysts, and other engineers to translate business requirements into technical solutions. Collaborate in an Agile environment, participating in sprints, standups, and retrospectives.
Relevant Work Experience: 4-12 years of hands-on experience in data engineering or related roles. Proven expertise with AWS services and real-time analytics platforms like Apache Pinot or similar technologies (e.g., Druid, ClickHouse). Proficiency in Python, Java, or Scala for data processing and pipeline development. Strong SQL skills and experience with both relational and NoSQL databases. Hands-on experience with streaming platforms such as Apache Kafka or AWS Kinesis. Familiarity with big data tools like Apache Spark, Flink, or Airflow. Strong problem-solving skills and a proactive approach to challenges. Excellent communication and collaboration abilities in cross-functional teams.
Preferred Qualifications: Experience with data lakehouse architectures (e.g., Delta Lake, Iceberg). Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes). Exposure to monitoring tools like Prometheus, Grafana, or CloudWatch. Familiarity with data visualization tools like Tableau or Superset.
What We Offer: Competitive compensation based on experience. Flexible work environment with opportunities for growth. Work on cutting-edge technologies and projects in data engineering and analytics.
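On the producer side, the real-time path described in this posting (events flowing through Kafka into an Apache Pinot REALTIME table) can be sketched in a few lines. The broker address, topic name, and event schema below are assumptions for illustration.

```python
# Producer side of a Kafka -> Apache Pinot REALTIME ingestion path.
# Broker, topic, and event schema are assumptions for illustration.
import json
import time

from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {
    "txn_id": "t-1001",
    "merchant_id": "m-42",
    "amount": 259.00,
    "event_ts": int(time.time() * 1000),   # Pinot time column in epoch millis
}
producer.send("payment-events", value=event)  # the Pinot table config subscribes here
producer.flush()
```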
What We Value In Our People:
You take the shot: you decide fast and you deliver right.
You are the CEO of what you do: you show ownership and make things happen.
You own tomorrow: by building solutions for the merchants and doing the right thing.
You sign your work like an artist: you seek to learn and take pride in the work you do.
Posted 5 days ago
0.0 - 4.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
The role involves reviewing documents for accuracy and completeness, ensuring compliance with established standards and procedures. You will be responsible for converting paper documents into digital formats through scanning and indexing, assigning identifying information for easy retrieval. Accurate data entry and management in document management systems and databases is a key part of the job. You will be required to perform quality checks on processed documents to ensure accuracy and completeness. Maintaining organized files, both physical and digital, to ensure easy access and retrieval is essential. Collaboration with teams to identify areas for process improvement and providing support to colleagues is also expected. Operating and maintaining document processing equipment, such as scanners, copiers, and related machinery, is a part of the role. This is a full-time, permanent position suitable for fresher candidates. The work location is in person. For further details or to apply for the job, please contact the employer at +91 8610458898.
Posted 5 days ago
12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Aeris: For more than three decades, Aeris has been a trusted cellular IoT leader enabling the biggest IoT programs and opportunities across Automotive, Utilities and Energy, Fleet Management and Logistics, Medical Devices, and Manufacturing. Our IoT technology expertise serves a global ecosystem of 7,000 enterprise customers, 30 mobile network operator partners, and 80 million IoT devices across the world. Aeris powers today’s connected smart world with innovative technologies and borderless connectivity that simplify management, enhance security, optimize performance, and drive growth. Built from the ground up for IoT and road-tested at scale, Aeris IoT Services are based on the broadest technology stack in the industry, spanning connectivity up to vertical solutions. As veterans of the industry, we know that implementing an IoT solution can be complex, and we pride ourselves on making it simpler. Our company is in an enviable spot. We’re profitable, and both our bottom line and our global reach are growing rapidly. We’re playing in an exploding market where technology evolves daily and new IoT solutions and platforms are being created at a fast pace.
A few things to know about us:
We put our customers first. When making decisions, we always seek to do what is right for our customer first, our company second, our teams third, and individual selves last.
We do things differently. As a pioneer in a highly competitive industry that is poised to reshape every sector of the global economy, we cannot fall back on old models. Rather, we must chart our own path and strive to out-innovate, out-learn, out-maneuver and out-pace the competition on the way.
We walk the walk on diversity. We’re a brilliant and eclectic mix of ethnicities, religions, industry experiences, sexual orientations, generations and more – and that’s by design. We see diverse perspectives as a core competitive advantage.
Integrity is essential. We believe in doing things well – and doing them right. Integrity is a core value here: you’ll see it embodied in our staff, our management approach and growing social impact work (we have a VP devoted to it). You’ll also see it embodied in the way we manage people and our HR issues: we expect employees and managers to deal with issues directly, immediately and with the utmost respect for each other and for the Company.
We are owners. Strong managers enable and empower their teams to figure out how to solve problems. You will be no exception, and will have the ownership, accountability and autonomy needed to be truly creative.
Job Title: Senior Oracle Database Administrator (DBA) – GCP
Location: Noida, India
We are seeking a highly skilled and experienced Senior Oracle DBA to manage and maintain our critical Oracle 12c, 18c, 19c, and 21c single-instance (with Data Guard) and RAC databases, hosted on Google Cloud Platform (GCP). The ideal candidate will possess deep expertise in Oracle database administration, including installation, configuration, patching, performance tuning, security, and backup/recovery strategies within a cloud environment. They will also have expertise and experience optimizing the underlying operating system and database parameters for maximum performance and stability.
Responsibilities:
Database Administration: Install, configure, and maintain Oracle 12c, 18c, 19c, and 21c single-instance (with Data Guard) and RAC databases on GCP Compute Engine.
Implement and manage Oracle Data Guard for high availability and disaster recovery, including switchovers, failovers, and broker configuration. Perform database upgrades, patching, and migrations. Develop and implement backup and recovery strategies, including RMAN configuration and testing. Monitor database performance and proactively identify and resolve performance bottlenecks. Troubleshoot database issues and provide timely resolution. Implement and maintain database security measures, including user access control, auditing, and encryption. Automate routine database tasks using scripting languages (e.g., Shell, Python, PL/SQL). Create and maintain database documentation.
Database Parameter Tuning:
In-depth knowledge of Oracle database initialization parameters and their impact on performance, with a particular focus on memory management parameters.
Expertise in tuning Oracle memory structures (SGA, PGA) for optimal performance in a GCP environment. This includes: precisely sizing the SGA components (Buffer Cache, Shared Pool, Large Pool, Java Pool, Streams Pool) based on workload characteristics and available GCP Compute Engine memory resources; optimizing PGA allocation (PGA_AGGREGATE_TARGET, PGA_AGGREGATE_LIMIT) to prevent excessive swapping and ensure efficient SQL execution; understanding the interaction between SGA and PGA memory regions and how they are affected by GCP instance memory limits; and tuning the RESULT_CACHE parameters for optimal query performance, considering the available memory and workload patterns.
Proficiency in using Automatic Memory Management (AMM) and Automatic Shared Memory Management (ASMM) features, and knowing when manual tuning is required for optimal results.
Knowledge of how GCP instance memory limits can impact Oracle's memory management and the appropriate adjustments to make.
Experience with analysing AWR reports and identifying areas for database parameter optimization, with a strong emphasis on identifying memory-related bottlenecks (e.g., high buffer busy waits, excessive direct path reads/writes).
Proficiency in tuning SQL queries using tools like SQL Developer and Explain Plan, particularly identifying queries that consume excessive memory or perform inefficient memory access patterns.
Knowledge of Oracle performance tuning methodologies and best practices, specifically as they apply to memory management in a cloud environment.
Experience with database indexing strategies and index optimization, understanding the impact of indexes on memory utilization.
Solid understanding of Oracle partitioning and its benefits for large databases, including how partitioning can affect memory usage and query performance.
Ability to perform proactive performance tuning based on workload analysis and trending, with a focus on memory usage patterns and potential memory-related performance issues.
Expertise in diagnosing and resolving memory leaks or excessive memory consumption issues within the Oracle database.
Deep understanding of how shared memory segments are managed within the Linux OS on GCP Compute Engine and how to optimize them for Oracle.
Data Guard Expertise:
Deep understanding of Oracle Data Guard architectures (Maximum Performance, Maximum Availability, Maximum Protection).
Expertise in configuring and managing Data Guard broker for automated switchovers and failovers.
Experience in troubleshooting Data Guard issues and ensuring data consistency.
Knowledge of Data Guard best practices for performance and reliability.
Proficiency in performing Data Guard role transitions (switchover, failover) with minimal downtime. Experience with Active Data Guard is a plus.
Operating System Tuning:
Deep expertise in Linux operating systems (e.g., Oracle Linux, Red Hat, CentOS) and their interaction with Oracle databases.
Performance tuning of the Linux operating system for optimal Oracle database performance, including: kernel parameter tuning (e.g., shared memory settings, semaphores, file descriptor limits); memory management optimization (e.g., HugePages configuration); I/O subsystem tuning (e.g., disk scheduler selection, filesystem optimization); and network configuration optimization (e.g., TCP/IP parameters).
Monitoring and analysis of OS performance metrics using tools like vmstat, iostat, top, and sar.
Identifying and resolving OS-level resource contention issues (CPU, memory, I/O).
Good to Have:
GCP Environment Management: Provision and manage GCP Compute Engine instances for Oracle databases, including selecting appropriate instance types and storage configurations. Configure and manage GCP networking components (VPCs, subnets, firewalls) for secure database access. Utilize GCP Cloud Monitoring and Logging for database monitoring and troubleshooting. Implement and manage GCP Cloud Storage for database backups. Experience with Infrastructure as Code (IaC) tools like Terraform or Cloud Deployment Manager to automate GCP resource provisioning. Cost optimization of Oracle database infrastructure on GCP.
Other Products and Platforms: Experience with other cloud platforms (AWS, Azure). Experience with NoSQL databases. Experience with Agile development methodologies. Experience with DevOps practices and tools (e.g., Ansible, Chef, Puppet). Experience with GoldenGate.
Qualifications: Bachelor's degree in Computer Science or a related field. Minimum 12+ years of experience as an Oracle DBA. Proven experience managing Oracle 12c, 18c, 19c, and 21c single-instance (with Data Guard) and RAC databases in a production environment, with strong Data Guard expertise. Extensive experience with Oracle database performance tuning, including OS-level and database parameter optimization. Hands-on experience with Oracle databases hosted on Google Cloud Platform (GCP). Strong understanding of Linux operating systems. Excellent troubleshooting and problem-solving skills. Strong communication and collaboration skills. Oracle Certified Professional (OCP) certification is highly preferred. GCP certifications (e.g., Cloud Architect, Cloud Engineer) are a plus.
Aeris may conduct background checks to verify the information provided in your application and assess your suitability for the role. The scope and type of checks will comply with the applicable laws and regulations of the country where the position is based. Additional detail will be provided via the formal application process.
Aeris walks the walk on diversity. We’re a brilliant mix of varying ethnicities, religions, cultures, sexual orientations, gender identities, ages and professional/personal/military experiences – and that’s by design. Diverse perspectives are essential to our culture, innovative process and competitive edge. Aeris is proud to be an equal opportunity employer.
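As one small, concrete slice of the parameter-tuning work described above, a DBA might pull the key SGA/PGA settings from a running instance for review. The sketch below uses the python-oracledb driver; the connection details are placeholders.

```python
# Pull the key memory parameters from a running instance for review.
# Connection details are placeholders; requires the python-oracledb driver.
import oracledb

conn = oracledb.connect(user="monitor", password="***", dsn="dbhost/ORCLPDB1")
with conn.cursor() as cur:
    cur.execute(
        """
        SELECT name, display_value
        FROM   v$parameter
        WHERE  name IN ('sga_target', 'pga_aggregate_target',
                        'pga_aggregate_limit', 'memory_target')
        """
    )
    for name, value in cur:
        print(f"{name:25} {value}")   # compare against instance memory and AWR findings
```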
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
You will be joining Ziance Technologies as an experienced Data Engineer (Gen AI), where your primary role will involve leveraging your expertise in Python and the Azure tech stack. Your responsibilities will include designing and implementing advanced data solutions, with a special focus on Generative AI concepts. With 5-8 years of experience under your belt, you must possess strong proficiency in the Python programming language. Additionally, you should have hands-on experience with REST APIs, FastAPI, Graph APIs, and SQLAlchemy. Your expertise in Azure services such as Data Lake, Azure SQL, Function App, and Azure Cognitive Search will be crucial for this role. A good understanding of concepts like chunking, embeddings, vectorization, indexing, prompting, hallucinations, and RAG will be beneficial. Experience in DevOps, including creating pull requests and maintaining code repositories, is a must-have skill. Your strong communication skills and ability to collaborate effectively with team members will be essential for success in this position. If you are looking to work in a dynamic environment where you can apply your skills in Azure, Python, and the data stack, this role at Ziance Technologies could be the perfect fit for you.
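A thin retrieval endpoint of the kind this stack implies (FastAPI in front of an Azure AI Search index) can be sketched as follows. The endpoint shape is an assumption, and the search backend is stubbed where a real search-client call would go.

```python
# Thin FastAPI retrieval endpoint; the search backend is stubbed where an
# Azure AI Search (or similar) client call would go. Shapes are assumptions.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class SearchRequest(BaseModel):
    question: str
    top_k: int = 3

def search_index(question: str, top_k: int) -> list:
    # Stub: replace with a call to the real index (e.g., Azure AI Search).
    return [f"match {i} for: {question}" for i in range(top_k)]

@app.post("/search")
def search(req: SearchRequest) -> dict:
    """Return the top-k chunks; a generation step would consume these."""
    return {"results": search_index(req.question, req.top_k)}
```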
Posted 5 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About the Role
We are seeking a Senior Backend Engineer with expertise in Java, Spring Boot, API design, security, and data infrastructure. This role combines backend development with strong data engineering skills, including ETL pipelines, search/indexing systems (Elastic or similar), reporting flows, and scaling PostgreSQL databases. You will be responsible for designing and implementing secure, high-performance backend services while also building data capabilities that support analytics and business insights.
Key Responsibilities
Backend Services & Architecture: Design, develop, and maintain backend services and APIs using Java and Spring Boot. Architect solutions for scalability, performance, and reliability in a microservices/cloud environment.
Data Infrastructure & ETL: Design and implement ETL pipelines to ingest, transform, and serve data for analytics and reporting. Work on setting up and managing Elasticsearch (or OpenSearch) clusters for search and analytics. Build reporting and data flow pipelines that integrate transactional and analytical data.
Database Performance & Scaling: Optimize PostgreSQL schemas, queries, and indexes for high-performance data access. Plan for horizontal and vertical scaling, partitioning, and caching strategies for large data volumes. Monitor and resolve database bottlenecks.
API Design & Data Access: Build robust, secure, and versioned REST APIs (GraphQL experience is a plus). Ensure proper data governance, security, and access control in all backend services.
Security & Best Practices: Implement strong security practices (Spring Security, OAuth2, JWT). Enforce best practices for code quality, CI/CD, and cloud-native deployments.
Collaboration & Mentorship: Partner with product managers, frontend engineers, and data analysts. Mentor junior developers and participate in architecture reviews.
Required Skills and Qualifications
Core Backend Skills: 5+ years of experience in backend development with Java and Spring Boot. Strong understanding of object-oriented programming, design patterns, and microservices.
Data Engineering / Infrastructure Expertise: Hands-on experience building ETL pipelines for reporting and analytics. Experience with Elasticsearch / OpenSearch or similar indexing/search systems. Expertise in PostgreSQL performance tuning, indexing, partitioning, and scaling strategies.
API Design & Cloud: Proficiency in RESTful API design; GraphQL experience preferred. Familiarity with containerized deployments (Docker, Kubernetes) and CI/CD.
Security & Performance: Experience with Spring Security, OAuth2, SSO. Knowledge of profiling, monitoring, and optimizing backend systems.
Preferred Qualifications: Knowledge of distributed data processing systems (Kafka, Spark, Airflow). Experience with data warehouses, OLAP tools, or BI/reporting solutions. Exposure to cloud-native data services (AWS RDS, Aurora, OpenSearch, etc.).
Education: Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
What We Offer: Opportunity to work on mission-critical backend systems and data infrastructure. Competitive salary and comprehensive benefits package. Collaborative and innovative work environment with modern tools and processes.
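The PostgreSQL tuning loop this posting names (add an index for a hot query, then verify the plan) is easy to illustrate. Python's psycopg2 is used below only to keep the examples on this page in one language, since the role itself is Java/Spring Boot; the table, column, and DSN are assumptions.

```python
# Index a hot filter column, then verify the plan with EXPLAIN ANALYZE.
# Table, column, and DSN are assumptions.
import psycopg2

conn = psycopg2.connect("dbname=app user=app host=localhost")
conn.autocommit = True   # run CREATE INDEX outside an explicit transaction
with conn.cursor() as cur:
    cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_created_at "
                "ON orders (created_at)")
    cur.execute("EXPLAIN ANALYZE SELECT count(*) FROM orders "
                "WHERE created_at >= now() - interval '7 days'")
    for (line,) in cur.fetchall():
        print(line)      # expect an index scan instead of a sequential scan
```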
Posted 6 days ago
6.0 - 10.0 years
8 - 12 Lacs
Rangareddy
Work from Office
As a SQL Developer with 5 to 8 years of experience, you will play a crucial role in designing, implementing, and maintaining the MSSQL database for our organization. This role involves working with other teams to understand their database requirements.
Required Candidate Profile: Proven experience as a SQL Developer or similar role. Strong knowledge of SQL, database design principles, and data security best practices. Familiarity with data modeling tools.
Posted 6 days ago
6.0 - 10.0 years
8 - 12 Lacs
Sangareddy
Work from Office
As a SQL Developer with 5 to 8 years of experience, you will play a crucial role in designing, implementing, and maintaining the MSSQL database for our organization. This role involves working with other teams to understand their database requirements.
Required Candidate Profile: Proven experience as a SQL Developer or similar role. Strong knowledge of SQL, database design principles, and data security best practices. Familiarity with data modeling tools.
Posted 6 days ago
6.0 - 10.0 years
8 - 12 Lacs
Medak
Work from Office
As a SQL Developer with 5 to 8 years of experience, you will play a crucial role in designing, implementing, and maintaining the MSSQL database for our organization. This role involves working with other teams to understand their database requirements.
Required Candidate Profile: Proven experience as a SQL Developer or similar role. Strong knowledge of SQL, database design principles, and data security best practices. Familiarity with data modeling tools.
Posted 6 days ago
6.0 - 10.0 years
8 - 12 Lacs
Hyderabad
Work from Office
As a SQL Developer with 5 to 8 years of experience, you will play a crucial role in designing, implementing, and maintaining the MSSQL database for our organization. This role involves working with other teams to understand their database requirements.
Required Candidate Profile: Proven experience as a SQL Developer or similar role. Strong knowledge of SQL, database design principles, and data security best practices. Familiarity with data modeling tools.
Posted 6 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
When you join Verizon
You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.
What You’ll Be Doing...
You’ll be responsible for developing highly efficient and reliable applications for various transformational programs as part of the Wireless Billing system. You'll be:
Enhancing and developing new features for applications using new and emerging technologies while maintaining coding standards and quality.
Defining and clarifying project scope.
Developing the project plan with coordination from Onshore, Business Owners, and Users.
Coming up with designs for business requirements in consonance with security, performance, and user-experience aspects.
Programming in Java / J2EE using the Spring framework / Spring Boot / Reactive and Oracle & Cassandra databases.
Implementing solutions hands-on, emphasizing reusable code development.
Driving discussions with the stakeholders on delivering the project.
Coordinating activities across different organizational functions.
Leading a team of 5-6 members on application development.
Representing the team in management and executive meetings.
Bringing a systematic problem-solving approach and a sense of ownership, commitment, and dedication.
Managing competing priorities and adapting to changes in project scope.
Following the AGILE processes as required by the project.
Where you'll be working...
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.
What We’re Looking For...
You are curious about new technologies and the possibilities they create. You enjoy the challenge of supporting applications while exploring ways to improve upon the technology. You are driven and motivated; with strong communication and analytical skills, you’re a sought-after team member that thrives in a dynamic work environment. You will be working with multiple stakeholders within wireless teams in understanding and delivering the requirements and design.
You’ll Need To Have:
Bachelor’s degree or three or more years of work experience.
Four or more years of relevant experience.
Strong knowledge of and working experience with Telecom billing.
Strong end-to-end design & development experience in Java, Spring Boot, and Microservices.
Strong concepts in Object Oriented Programming and Design Patterns.
Good understanding of SQL queries, Linux, and microservices architecture.
Good experience in DevOps (Jenkins, SonarQube, HP Fortify).
Hands-on experience with Oracle DB, Cassandra, data modelling, data replication, clustering, and indexing for handling large data sets.
Knowledge of and hands-on experience in Unix and shell scripting.
Experience in Agile methodologies and DevSecOps implementation.
Cloud implementation experience.
Experience in mentoring junior team members.
Experience in product/tool evaluations to suit the project needs.
Experience with the DevOps toolchain: Jenkins, Docker, Git, SonarQube, Fortify, Artifactory, and Ansible.
Proven experience in creating automation tools for productivity and process improvements. Expertise in troubleshooting issues.
Even better if you have one or more of the following:
Strong communication and critical thinking skills.
Experience working in Agile/SAFe teams.
Knowledge of reusable component development and databases/SQL.
Collaboration skills to manage peers, partners, and other stakeholders.
If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above.
Where you’ll be working
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.
Scheduled Weekly Hours: 40
Equal Employment Opportunity
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
Posted 6 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
When you join Verizon
You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.
What You’ll Be Doing...
You’ll be responsible for developing highly efficient and reliable applications for various transformational programs as part of the Wireless Billing system. You'll be:
Enhancing and developing new features for applications using new and emerging technologies while maintaining coding standards and quality.
Defining and clarifying project scope.
Developing the project plan with coordination from Onshore, Business Owners, and Users.
Coming up with designs for business requirements in consonance with security, performance, and user-experience aspects.
Programming in Java / J2EE using the Spring framework / Spring Boot / Reactive and Oracle & Cassandra databases.
Implementing solutions hands-on, emphasizing reusable code development.
Driving discussions with the stakeholders on delivering the project.
Coordinating activities across different organizational functions.
Leading a team of 5-6 members on application development.
Representing the team in management and executive meetings.
Bringing a systematic problem-solving approach and a sense of ownership, commitment, and dedication.
Managing competing priorities and adapting to changes in project scope.
Following the AGILE processes as required by the project.
Where you'll be working...
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.
What We’re Looking For...
You are curious about new technologies and the possibilities they create. You enjoy the challenge of supporting applications while exploring ways to improve upon the technology. You are driven and motivated; with strong communication and analytical skills, you’re a sought-after team member that thrives in a dynamic work environment. You will be working with multiple stakeholders within wireless teams in understanding and delivering the requirements and design.
You’ll Need To Have:
Bachelor’s degree or three or more years of work experience.
Four or more years of relevant experience.
Strong knowledge of and working experience with Telecom billing.
Strong end-to-end design & development experience in Java, Spring Boot, and Microservices.
Strong concepts in Object Oriented Programming and Design Patterns.
Good understanding of SQL queries, Linux, and microservices architecture.
Good experience in DevOps (Jenkins, SonarQube, HP Fortify).
Hands-on experience with Oracle DB, Cassandra, data modelling, data replication, clustering, and indexing for handling large data sets.
Knowledge of and hands-on experience in Unix and shell scripting.
Experience in Agile methodologies and DevSecOps implementation.
Cloud implementation experience.
Experience in mentoring junior team members.
Experience in product/tool evaluations to suit the project needs.
Experience with the DevOps toolchain: Jenkins, Docker, Git, SonarQube, Fortify, Artifactory, and Ansible.
Proven experience in creating automation tools for productivity and process improvements. Expertise in troubleshooting issues.
Even better if you have one or more of the following:
Strong communication and critical thinking skills.
Experience working in Agile/SAFe teams.
Knowledge of reusable component development and databases/SQL.
Collaboration skills to manage peers, partners, and other stakeholders.
If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above.
Where you’ll be working
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.
Scheduled Weekly Hours: 40
Equal Employment Opportunity
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
Posted 6 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Overview
Synoptek
We think globally, act locally. As a Managed Services Provider, Synoptek provides world-class strategic IT leadership and hyper-efficient IT operational support, enabling our global client base to grow and transform their businesses. We are excited to have experienced continuous growth, and in keeping with that momentum we are seeking to add talent to our team. When you partner with Synoptek, you engage with an ever-growing, ever-evolving IT organization that provides a high-caliber team, results growth, and clarity.
Responsibilities
Data Scientist
This is an amazing opportunity to work within one of the fastest growing Managed Services Providers. We are a company with a heart and soul dedicated to the ongoing success and growth of our employees and continued business success of the customers we support. We foster a fun and connected environment with employee benefits extending beyond general compensation and into company-sponsored events and an invested culture of learning. The Data Scientist is responsible for conducting end-to-end analysis, encompassing requirements, activities, and design. Additionally, the Data Scientist is responsible for developing analysis and reporting capabilities and implementing performance and quality control measures to drive continuous improvement.
Role & Responsibilities:
Should be able to code in Python and work with simple to complex SQL.
Work as an independent contributor in building AI solutions using the latest tech stack.
Should be able to research, learn, and implement the latest models in AI and GenAI.
Identify, analyse, and interpret trends or patterns in complex datasets.
Collaborate with the team to prioritize business and information needs.
Serve as a consultant for clients nationwide.
Identify and define process improvement opportunities.
Excellent problem-solving ability and verbal communication.
Experience:
At least 3 years of relevant experience in building AI solutions.
Proficiency in programming languages such as Python, FastAPI, SQL, or PySpark (good to have).
Experience in developing conversational AI or virtual assistant applications.
Proven experience in building and deploying applications using LLMs like OpenAI GPT-4o, Claude, Llama, Grok, Gemini, or similar.
Hands-on experience in building GenAI applications.
Good to have: working experience in building Agentic AI applications.
Familiarity with acquiring and managing huge and complex data from various sources, including structured and unstructured data sources.
Extensive experience with cloud platforms, particularly Microsoft Azure.
Knowledge of natural language processing (NLP) and information retrieval.
Knowledge of and hands-on experience with the LangChain framework, Azure OpenAI, Azure AI Search Service, indexing, and embedding techniques.
Good to have: application development experience using AI models and API integrations.
Hands-on experience in AI, ML, and deep learning algorithms like regression, decision trees, and neural networks.
Experience in working with Azure Cosmos DB, Azure Blobs, Azure Functions, and App Service.
Qualifications
Education: Bachelor’s degree in Computer Science, Information Technology, or a related field from an accredited college or university preferred. In lieu of an undergraduate degree, the ratio is 1:1 - meaning one year of college equals one year of work experience and vice versa.
Experience: Customarily has at least 1 year of job-related experience. Familiarity with acquiring and managing data from various sources, including primary and secondary data sources.
Skills/Attributes
Synoptek core DNA behaviors:
• Clarity: Possesses excellent communication skills; makes a concentrated effort to speak the customer's language. Ability to field questions with concise, well-constructed responses.
• OwnIT: Shows integrity, innovation, and accountability in completing daily assignments.
• Results: Solutions-focused and driven to resolve conflict quickly and precisely. Proactively looks for opportunities to contribute to the company's business goals.
• Growth: Willing to learn and ask questions. Constantly looking for new ways to improve yourself. Ability to adapt and grow in a fast-paced environment.
• Team: Embraces both customers and colleagues as team members. Ability to be flexible, respectful, engaged, and collaborative.
• Capacity to identify, analyze, and interpret trends or patterns in complex datasets and propose solutions to business challenges.
• Strong verbal and written communication skills to collaborate with management, prioritize business needs, and work effectively as a consultant with clients.

Working Conditions
We live by the motto 'work hard, play hard' and strive to support our employees in both their professional and personal goals. We believe that by hiring the right people, leading process improvement, and leveraging technology, we achieve superior results. Work is performed primarily in an office or remote environment. May be subject to time constraints and tight deadlines. May require occasional travel.

EEO Statement
We are proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, age, veteran status, sexual orientation, gender identity, marital status, pregnancy, genetic information, or any other characteristic protected by law, and will not be discriminated against on the basis of disability. It is our intention that all qualified applicants are given equal opportunity and that employment decisions be based on job-related factors.
Posted 6 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: SQL Database Administrator (DBA)
Location: Pune
Experience Level: 5+ years
Job Type: Full-time

Job Description: SQL Database Administrator (DBA)
We are seeking a skilled SQL Database Administrator (DBA) to manage, maintain, and optimize our SQL Server databases. The ideal candidate will be responsible for ensuring database performance, integrity, security, and availability.

Key Responsibilities:
• Install, configure, and maintain SQL Server databases.
• Monitor database performance; implement tuning and optimization.
• Perform regular backups and recovery operations (see the backup automation sketch below).
• Manage security, access controls, and data integrity.
• Automate routine tasks and manage scheduled jobs.
• Troubleshoot and resolve database issues and outages.

Required Skills:
• Strong knowledge of SQL Server (2016/2019/2022).
• Experience in T-SQL, stored procedures, triggers, and functions.
• Proficiency in performance tuning and query optimization.
• Experience with backup/restore strategies and disaster recovery.
• Familiarity with SSIS, SSRS, and SSAS is a plus.
• Understanding of indexing, partitioning, and replication.
• Working knowledge of Windows Server environments.
• Strong analytical and problem-solving skills.

Employee Benefits:
• Group Medical Insurance
• Cab facility
• Meals/snacks
• Continuous Learning Program

Company Profile:
Stratacent is a global IT consulting and services firm, headquartered in Jersey City, NJ, with global delivery centres in Pune and Gurugram plus offices in the USA, London, Canada, and South Africa. We are a leading IT services provider focusing on Financial Services, Insurance, Healthcare, and Life Sciences. We help our customers in their transformation journey and provide services around Information Security, Cloud Services, Data and AI, Automation, Application Development, and IT Operations. URL - http://stratacent.com

Stratacent India Private Limited is an equal opportunity employer and will not discriminate against any employee or applicant for employment on the basis of race, color, creed, religion, age, sex, national origin, ancestry, handicap, or any other factor protected by law.
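As an illustration of the backup and automation duties above, here is a minimal sketch of a scripted full backup driven from Python with pyodbc. The server, database, and backup path are hypothetical placeholders; note that BACKUP DATABASE cannot run inside a user transaction, which is why the connection is opened with autocommit.

```python
# Minimal sketch: automate a full SQL Server backup via pyodbc.
# SERVER, DATABASE, and BACKUP_DIR are hypothetical placeholders.
import datetime
import pyodbc

SERVER = "localhost"        # assumption: local default instance
DATABASE = "SalesDb"        # assumption: target database name
BACKUP_DIR = r"D:\backups"  # assumption: path writable by the SQL Server service

def backup_database() -> None:
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    backup_file = rf"{BACKUP_DIR}\{DATABASE}_{stamp}.bak"
    # BACKUP DATABASE cannot run inside a transaction, so the
    # connection must use autocommit=True.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        f"SERVER={SERVER};DATABASE=master;Trusted_Connection=yes;",
        autocommit=True,
    )
    try:
        # Values are script-controlled (not user input), so the
        # statement is assembled directly rather than parameterized.
        conn.execute(
            f"BACKUP DATABASE [{DATABASE}] "
            f"TO DISK = N'{backup_file}' WITH COMPRESSION, CHECKSUM, INIT;"
        )
    finally:
        conn.close()

if __name__ == "__main__":
    backup_database()
```

In practice a script like this would be wrapped in a SQL Server Agent job or a scheduled task, matching the "manage scheduled jobs" responsibility.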
Posted 6 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Full Stack Developer
Job Location: Hyderabad
Notice Period: 15 Days

Role Overview:
· Play a crucial role in driving the company's mission to simplify and innovate construction management.
· Collaborate with diverse clients worldwide, helping them transform complex workflows.
· Thrive in a fast-paced, tech-driven environment that encourages continuous learning and growth.
· Advance your career by delivering real impact on large-scale infrastructure and construction projects.

Key Responsibilities:
· We are looking for a tech enthusiast with a knack for full stack development, eager to dive into code and bring ideas to life.
· Own features from brainstorming to deployment - handling everything from database architecture to front-end performance.
· Optimize and Scale: Ensure that our platform is high-performing, scalable, and future-proof. You will be part of laying the groundwork for big, exciting growth.
· Collaborate & Conquer: Work closely with our design, product, and AI teams to integrate machine learning and automation features into our platform, pushing the boundaries of what tech can do in construction.
· Write clean, efficient, and maintainable code - a track record that talks.

Required Qualifications:
· Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Equivalent practical experience may be acceptable with a strong portfolio and leadership track record.
· 3+ years of experience with either the MEAN (MongoDB, Express, Angular, Node.js) or MERN (MongoDB, Express, React, Node.js) stack.
· Hands-on experience in designing and building scalable, secure full-stack applications in a microservices or monolithic architecture.
· Strong proficiency in Angular 15+, RxJS, NgRx (or other state management libraries).
· Solid understanding of TypeScript, JavaScript, HTML5, and CSS3.
· Experience building responsive and cross-browser applications.
· Familiarity with Angular CLI, lazy loading, routing, and component-based architecture.
· Proficiency in MongoDB, its query syntax, and the aggregation framework (see the aggregation sketch below).
· Knowledge of Mongoose (ODM). Understanding of schema design, indexing, and performance tuning.

Nice-to-have:
· Experience with GraphQL, Socket.IO, or WebRTC.
· Understanding of Server-Side Rendering (SSR) using Next.js (for MERN) or Angular Universal (for MEAN).
· Knowledge of Redis, Kafka, or other message queues.
· Familiarity with multi-tenant architecture or SaaS product engineering.

What We Offer:
· Grow with purpose: Accelerate your career with hands-on learning and expert mentorship.
· Culture that empowers: Join a team where your ideas matter and diversity is celebrated.
· Perks that matter: Enjoy flexible work options and benefits designed to support your work-life balance.
· Make a real impact: Work on advanced solutions that simplify construction and help build smarter cities and communities worldwide.

Best regards,
G Vanaja
HR Manager
Velsprint Technologies
+91-8074773135
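To make the aggregation-framework requirement concrete, here is a minimal sketch of a pipeline that groups project tasks by status. It is shown with pymongo for brevity; the database, collection, and field names are hypothetical, and the same $match/$group/$sort stages pass unchanged to Mongoose's Model.aggregate() in Node.js.

```python
# Minimal sketch: a MongoDB aggregation pipeline, shown with pymongo.
# Database, collection, and field names are hypothetical placeholders;
# the pipeline stages are identical when used from Mongoose/Node.js.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumption: local instance
tasks = client["construction"]["tasks"]

pipeline = [
    # Keep only tasks belonging to active projects.
    {"$match": {"project_status": "active"}},
    # Group by task status, counting tasks and summing estimated hours.
    {"$group": {
        "_id": "$status",
        "task_count": {"$sum": 1},
        "total_hours": {"$sum": "$estimated_hours"},
    }},
    # Largest buckets first.
    {"$sort": {"task_count": -1}},
]

for row in tasks.aggregate(pipeline):
    print(row["_id"], row["task_count"], row["total_hours"])
```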
Posted 6 days ago
9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About McDonald’s:
One of the world’s largest employers, with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary:
We are seeking an experienced Data Architect to design, implement, and optimize scalable data solutions on Amazon Web Services (AWS) and/or Google Cloud Platform (GCP). The ideal candidate will lead the development of enterprise-grade data architectures that support analytics, machine learning, and business intelligence initiatives while ensuring security, performance, and cost optimization.

Who we are looking for:

Primary Responsibilities:

Architecture & Design:
Design and implement comprehensive data architectures using AWS or GCP services
Develop data models, schemas, and integration patterns for structured and unstructured data
Create solution blueprints, technical documentation, architectural diagrams, and best-practice guidelines
Implement data governance frameworks and ensure compliance with security standards
Design disaster recovery and business continuity strategies for data systems

Technical Leadership:
Lead cross-functional teams in implementing data solutions and migrations
Provide technical guidance on cloud data services selection and optimization
Collaborate with stakeholders to translate business requirements into technical solutions
Drive adoption of cloud-native data technologies and modern data practices

Platform Implementation:
Implement data pipelines using cloud-native services (AWS Glue, Google Dataflow, etc.)
Configure and optimize data lakes and data warehouses (S3/Redshift, GCS/BigQuery)
Set up real-time streaming data processing solutions (Kafka, Airflow, Pub/Sub)
Implement automated data quality monitoring and validation processes
Establish CI/CD pipelines for data infrastructure deployment

Performance & Optimization:
Monitor and optimize data pipeline performance and cost efficiency
Implement data partitioning, indexing, and compression strategies (see the partitioning sketch after this posting)
Conduct capacity planning and scaling recommendations
Troubleshoot complex data processing issues and performance bottlenecks
Establish monitoring, alerting, and logging for data systems

Skills:
Bachelor’s degree in Computer Science, Data Engineering, or a related field
9+ years of experience in data architecture and engineering
5+ years of hands-on experience with AWS or GCP data services
Experience with large-scale data processing and analytics platforms
AWS: Redshift, S3, Glue, EMR, Kinesis, Lambda, Data Pipeline, Step Functions, CloudFormation
GCP: BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, Cloud Functions, Cloud Composer, Deployment Manager
IAM, VPC, and security configurations
SQL and NoSQL databases
Big data technologies (Spark, Hadoop, Kafka)
Programming languages (Python, Java, SQL)
Data modeling and ETL/ELT processes
Infrastructure as Code (Terraform, CloudFormation)
Container technologies (Docker, Kubernetes)
Data warehousing concepts and dimensional modeling
Experience with modern data architecture patterns
Real-time and batch data processing architectures
Data governance, lineage, and quality frameworks
Business intelligence and visualization tools
Machine learning pipeline integration
Strong communication and presentation abilities
Leadership and team collaboration skills
Problem-solving and analytical thinking
Customer-focused mindset with business acumen

Preferred Qualifications:
Master’s degree in a relevant field
Cloud certifications (AWS Solutions Architect, GCP Professional Data Engineer)
Experience with multiple cloud platforms
Knowledge of data privacy regulations (GDPR, CCPA)

Work location: Hyderabad, India
Work pattern: Full-time role
Work mode: Hybrid

Additional Information:
McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture.
At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
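As a concrete illustration of the partitioning and compression strategies mentioned in the Data Architect role above, here is a minimal PySpark sketch of a partitioned data-lake write. The bucket paths, column names, and partition key are hypothetical placeholders.

```python
# Minimal sketch: partitioned, compressed writes for a data lake layout.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-lake").getOrCreate()

# Read raw JSON events landed by an upstream ingestion job.
raw = spark.read.json("s3://example-raw/orders/")  # assumption: raw zone path

# Derive a date column to use as the partition key, so downstream
# queries filtering on order_date prune whole directories.
curated = raw.withColumn("order_date", F.to_date("order_timestamp"))

(curated.write
    .mode("overwrite")
    .partitionBy("order_date")          # directory-level partition pruning
    .option("compression", "snappy")    # splittable, CPU-cheap compression
    .parquet("s3://example-curated/orders/"))
```

Choosing a low-cardinality partition key such as a date keeps file counts manageable; partitioning on a high-cardinality column would produce many tiny files and hurt scan performance.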
Posted 6 days ago
2.0 - 5.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: LLM Model Developer
Project Role Description: Fine-tunes Large Language Models with emphasis on instruction fine-tuning and domain adaptation to enhance model relevance and performance in specific contexts.
Must have skills: Large Language Models
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an LLM Model Developer, you will engage in the intricate process of fine-tuning Large Language Models, focusing on instruction fine-tuning and domain adaptation. Your typical day will involve collaborating with cross-functional teams to enhance model relevance and performance in specific contexts, ensuring that the models meet the diverse needs of various applications. You will analyze model outputs, iterate on training processes, and implement strategies to optimize performance, all while maintaining a keen awareness of the latest advancements in the field of artificial intelligence and machine learning. A minimal instruction fine-tuning sketch follows this posting.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Develop and implement innovative strategies for model fine-tuning and adaptation.
- Mentor junior team members to enhance their skills and knowledge in model development.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Large Language Models.
- Good-to-Have Skills: Experience with natural language processing frameworks.
- Strong understanding of machine learning principles and practices.
- Familiarity with data preprocessing techniques specific to language models.
- Experience in evaluating model performance using various metrics.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Large Language Models.
- This position is based in Chennai.
- A 15 years full time education is required.
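To ground the instruction fine-tuning described above, here is a minimal sketch using Hugging Face transformers with a LoRA adapter via peft. The base model name, prompt template, and toy dataset are assumptions for illustration, not part of the posting; a real run would use a domain instruction corpus, held-out evaluation, and a larger instruction-tunable model.

```python
# Minimal sketch: instruction fine-tuning with a LoRA adapter.
# Model name, prompt template, and data are hypothetical placeholders.
from datasets import Dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

BASE_MODEL = "gpt2"  # assumption: small stand-in for a real LLM

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Wrap the base model with a small LoRA adapter so only a fraction
# of the parameters are updated during domain adaptation.
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16,
                                         task_type="CAUSAL_LM"))

# Toy instruction/response pairs; a real corpus would be domain data.
pairs = [
    {"instruction": "Summarize: The shipment arrived two days late.",
     "response": "A shipment was delayed by two days."},
]

def to_features(example):
    # Serialize each pair into a single training string.
    text = (f"### Instruction:\n{example['instruction']}\n"
            f"### Response:\n{example['response']}")
    return tokenizer(text, truncation=True, max_length=512)

train_ds = Dataset.from_list(pairs).map(
    to_features, remove_columns=["instruction", "response"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ift-out", num_train_epochs=1,
                           per_device_train_batch_size=1),
    train_dataset=train_ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```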
Posted 6 days ago
7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
This role is for one of Weekday's clients
Min Experience: 7 years
Location: Gurugram
JobType: full-time

Requirements
We are looking for an experienced Data Engineer with deep expertise in Azure and/or AWS Databricks to join our growing data engineering team. As a Senior Data Engineer, you will be responsible for designing, building, and optimizing data pipelines, enabling seamless data integration and real-time analytics. This role is ideal for professionals who have hands-on experience with cloud-based data platforms and big data processing frameworks, and a strong understanding of data modeling, pipeline orchestration, and performance tuning. You will work closely with data scientists, analysts, and business stakeholders to deliver scalable and reliable data infrastructure that supports high-impact decision-making across the organization.

Key Responsibilities:

Design and Development of Data Pipelines:
Design, develop, and maintain scalable and efficient data pipelines using Databricks on Azure or AWS (see the Delta pipeline sketch below).
Integrate data from multiple sources, including structured, semi-structured, and unstructured datasets.
Implement ETL/ELT pipelines for both batch and real-time processing.

Cloud Data Platform Expertise:
Use Azure Data Factory, Azure Synapse, AWS Glue, S3, Lambda, or similar services to build robust and secure data workflows.
Optimize storage, compute, and processing costs using appropriate services within the cloud environment.

Data Modeling & Governance:
Build and maintain enterprise-grade data models, schemas, and lakehouse architecture.
Ensure adherence to data governance, security, and privacy standards, including data lineage and cataloging.

Performance Tuning & Monitoring:
Optimize data pipelines and query performance through partitioning, caching, indexing, and memory management.
Implement monitoring tools and alerts to proactively address pipeline failures or performance degradation.

Collaboration & Documentation:
Work closely with data analysts, BI developers, and data scientists to understand data requirements.
Document all processes, pipelines, and data flows for transparency, maintainability, and knowledge sharing.

Automation and CI/CD:
Develop and maintain CI/CD pipelines for automated deployment of data pipelines and infrastructure using tools like GitHub Actions, Azure DevOps, or Jenkins.
Implement data quality checks and unit tests as part of the data development lifecycle.

Skills & Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
7+ years of experience in data engineering roles, with hands-on experience in Azure or AWS ecosystems.
Strong expertise in Databricks (including notebooks, Delta Lake, and MLflow integration).
Proficiency in Python and SQL; experience with PySpark or Spark strongly preferred.
Experience with data lake architectures, data warehouse platforms (such as Snowflake, Redshift, Synapse), and lakehouse principles.
Familiarity with infrastructure as code (Terraform, ARM templates) is a plus.
Strong analytical and problem-solving skills with attention to detail.
Excellent verbal and written communication skills.
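Here is a minimal sketch of the kind of Databricks pipeline described above: a batch read, a transformation, a simple data quality gate, and a partitioned Delta write. The mount paths, column names, and quality threshold are assumptions for illustration.

```python
# Minimal sketch: batch ETL to Delta Lake with a simple quality gate.
# Paths, columns, and the threshold are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.json("/mnt/raw/orders/")  # assumption: mounted raw zone

clean = (raw
    .dropDuplicates(["order_id"])
    .withColumn("ingest_date", F.current_date())
    .filter(F.col("amount").isNotNull()))

# Data quality gate: fail the job if too many rows were dropped,
# instead of silently publishing a partial table.
raw_count, clean_count = raw.count(), clean.count()
if raw_count and clean_count / raw_count < 0.95:  # assumption: 95% threshold
    raise ValueError(f"Quality gate failed: kept {clean_count}/{raw_count} rows")

(clean.write
    .format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .save("/mnt/curated/orders/"))
```

In a CI/CD setup, the transformation logic would live in a testable function so the quality gate and schema expectations can be covered by unit tests before deployment.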
Posted 6 days ago