
1651 Indexing Jobs - Page 41

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

1.0 years

0 Lacs

Thalassery, Kerala, India

On-site

Source: LinkedIn

Job Title: SEO Specialist (Python/Django & HTML Projects)
Location: Thalassery, Kerala (only candidates from Kerala will be considered)
Job Type: Full-time / Contract
Experience Required: Minimum 1 year
Contact: Send your CV to hr@nexxalgn.com or WhatsApp +91 98472 52577 (WhatsApp only)

Job Description: We are hiring a dedicated and skilled SEO Specialist to join our team for ongoing and upcoming projects built on Python (Django) and HTML. This role is ideal for someone who has at least 1 year of hands-on SEO experience and a solid understanding of how technical SEO applies to Django-based websites. Note: we are only considering candidates based in Kerala, preferably near Thalassery.

Key Responsibilities:
- Execute on-page, off-page, and technical SEO strategies for Python/Django and HTML-based websites.
- Conduct regular SEO audits, fix crawl errors, improve site speed, and enhance indexing.
- Collaborate with developers to optimize Django templates, routing, and static content (see the sitemap sketch after this listing).
- Analyze traffic and keyword performance using Google Search Console, Google Analytics, Ahrefs, SEMrush, etc.
- Keep up to date with the latest SEO trends and algorithm changes.

Requirements:
- Minimum 1 year of SEO experience.
- Strong knowledge of HTML and CSS, and basic familiarity with the structure of a Django project.
- Proficiency in SEO tools such as Google Analytics, Google Search Console, Ahrefs, and SEMrush.
- Understanding of technical SEO elements such as sitemaps, robots.txt, canonical tags, and schema.
- Strong analytical, problem-solving, and communication skills.

Preferred Skills:
- Knowledge of Django templating and routing.
- Experience with performance tools (e.g., Google Lighthouse, PageSpeed Insights).
- Ability to implement structured data (JSON-LD, Microdata).

📧 To Apply: Send your updated CV to hr@nexxalgn.com.
📱 WhatsApp: +91 98472 52577 (WhatsApp only).
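As a hedged illustration of the Django-side technical SEO work described above, here is a minimal XML sitemap wired through django.contrib.sitemaps. The app, model, and field names (blog, BlogPost, published, updated_at) are hypothetical, not from the posting.

    from django.contrib.sitemaps import Sitemap
    from blog.models import BlogPost  # hypothetical app and model

    class BlogSitemap(Sitemap):
        changefreq = "weekly"   # crawl-frequency hint; tune per content type
        priority = 0.6

        def items(self):
            return BlogPost.objects.filter(published=True)

        def lastmod(self, obj):
            return obj.updated_at

    # urls.py would then expose it via the stock view:
    # from django.contrib.sitemaps.views import sitemap
    # path("sitemap.xml", sitemap, {"sitemaps": {"blog": BlogSitemap}})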

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

About PurpleMerit
PurpleMerit is an AI-focused technology startup dedicated to building innovative, scalable, and intelligent software solutions. We leverage the latest advancements in artificial intelligence and cloud technology to deliver impactful products for a global audience. As a fully remote team, we value skill, curiosity, and a passion for solving real-world problems over formal degrees or prior experience.

Job Description
We are seeking motivated and talented Full Stack Developers to join our dynamic team. This position is structured as a mandatory internship-to-full-time pathway, designed to nurture and evaluate your technical and collaborative skills in a real-world environment. You will work on a variety of projects, including web applications, Chrome extensions, PWA apps, and full-stack software solutions, with a strong emphasis on system design and AI integration.

Roles & Responsibilities
- Design, develop, and maintain robust, scalable web applications and software solutions.
- Build end-to-end applications, including websites, Chrome extensions, and PWAs.
- Architect systems with a strong focus on system design, scalability, and maintainability.
- Develop RESTful and GraphQL APIs for seamless frontend-backend integration.
- Implement secure authentication and authorization (OAuth, JWT, session management, role-based access; see the sketch after this listing).
- Integrate AI tools and APIs; demonstrate a basic understanding of AI agents and prompt engineering.
- Manage cloud infrastructure (AWS or Azure) and CI/CD pipelines for efficient deployment.
- Perform basic server management (Linux/Unix, Nginx, Apache).
- Design and optimize databases (schema design, normalization, indexing, query optimization).
- Ensure code quality through testing and adherence to best practices.
- Collaborate effectively in a remote, agile startup environment.

Required Skills
- Strong understanding of system design and software architecture.
- Experience with CI/CD pipelines and cloud platforms (AWS or Azure).
- Proficiency with version control systems (Git).
- Knowledge of API development (RESTful and GraphQL).
- Familiarity with authentication and authorization protocols (OAuth, JWT, sessions, RBAC).
- Basic server management skills (Linux/Unix, Nginx, Apache).
- Database design and optimization skills.
- Experience integrating AI tools or APIs; basic understanding of AI agents.
- Basic knowledge of prompt engineering.
- Commitment to testing and quality assurance.
- Ability to build complete, production-ready applications.
- No formal degree or prior experience required; a strong willingness to learn is essential.

Salary Structure
1. Pre-Qualification Internship (mandatory). Duration: 2 months. Stipend: ₹5,000/month. Purpose: evaluate foundational skills, work ethic, and cultural fit.
2. Internship (mandatory). Duration: 3 months. Stipend: ₹7,000–₹15,000/month (based on pre-qualification performance). Purpose: deepen technical involvement and demonstrate capability.
3. Full-Time Employment. Salary: ₹3 LPA–₹9 LPA (performance-based, determined during the internships). Note: full-time offers are extended only upon successful completion of both internship stages.

Why Our Salary Structure is Unique
At PurpleMerit, we recognize the challenges of remote hiring in the AI era, where traditional interviews can be unreliable due to the widespread use of AI tools. To ensure genuine skills, cultural fit, and work ethic, we have implemented a structured pathway to full-time employment. This process allows both you and PurpleMerit to evaluate fit through real-world collaboration before making a long-term commitment. We believe in "try and then decide", not just interviews, because we want to build a team based on real performance and trust.

Why Join PurpleMerit?
- 100% remote work with a flexible schedule.
- Direct involvement in building AI-driven products from the ground up.
- Mentorship from experienced engineers and founders.
- Transparent growth path from internship to full-time employment.
- Merit-based culture: your skills and contributions are what matter.
- Opportunity to work on diverse projects and cutting-edge technologies.

Your Impact
At PurpleMerit, you will:
- Directly influence the architecture and development of innovative AI products.
- Solve complex challenges and see your solutions implemented in real products.
- Help shape our engineering culture and set high standards for quality.
- Accelerate your growth as a developer in a supportive, fast-paced environment.

If you are passionate about building impactful software and eager to work in an AI-driven startup, we encourage you to apply. Join us at PurpleMerit and be a part of our journey to innovate and excel. Apply now to start your career with PurpleMerit!
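A minimal sketch of the JWT-based authentication and role claim named in the requirements, using the PyJWT library. The secret, claim names, and one-hour lifetime are illustrative assumptions, not the company's actual scheme.

    import datetime
    import jwt  # PyJWT

    SECRET_KEY = "change-me"  # illustrative; load from a secret store in practice

    def issue_token(user_id: str, role: str) -> str:
        payload = {
            "sub": user_id,
            "role": role,  # simple claim supporting role-based access control
            "exp": datetime.datetime.now(datetime.timezone.utc)
                   + datetime.timedelta(hours=1),
        }
        return jwt.encode(payload, SECRET_KEY, algorithm="HS256")

    def verify_token(token: str) -> dict:
        # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on bad tokens
        return jwt.decode(token, SECRET_KEY, algorithms=["HS256"])

    print(verify_token(issue_token("user-42", "admin"))["role"])  # -> admin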

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

This job is with Kyndryl, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.

Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Within our Database Administration team at Kyndryl, you'll be a master of managing and administering the backbone of our technological infrastructure. You'll be the architect of the system, shaping the base definition, structure, and documentation to ensure the long-term success of our business operations. Your expertise will be crucial in configuring, installing and maintaining database management systems, ensuring that our systems are always running at peak performance. You'll also be responsible for managing user access, implementing the highest standards of security to protect our valuable data from unauthorized access. In addition, you'll be a disaster recovery guru, developing strong backup and recovery plans to ensure that our system is always protected in the event of a failure. Your technical acumen will be put to use as you support end users and application developers in solving complex problems related to our database systems. As a key player on the team, you'll implement policies and procedures to safeguard our data from external threats. You will also conduct capacity planning and growth projections based on usage, ensuring that our system is always scalable to meet our business needs. You'll be a strategic partner, working closely with various teams to coordinate systematic database project plans that align with our organizational goals. Your contributions will not go unnoticed - you'll have the opportunity to propose and implement enhancements that will improve the performance and reliability of the system, enabling us to deliver world-class services to our customers.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career, from Junior Administrator to Architect. We have training and upskilling programs that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. One of the benefits of Kyndryl is that we work with customers in a variety of industries, from banking to retail. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important - you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused - someone who prioritizes customer success in their work. And finally, you're open and borderless - naturally inclusive in how you work with others.

Required Technical and Professional Expertise
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5-8 years of proven hands-on experience in SQL database design, development, administration, and performance tuning.
- Expertise in a specific SQL database platform (e.g., Microsoft SQL Server, PostgreSQL, MySQL); experience with multiple platforms is a plus.
- Strong proficiency in writing complex SQL queries, stored procedures, functions, and triggers.
- Solid understanding of database concepts, including relational database theory, normalization, indexing, and transaction management (a small indexing sketch follows this listing).
- Experience with database performance monitoring and tuning tools.
- Experience with database backup and recovery strategies.
- Knowledge of database security principles and best practices.
- Experience with data migration and integration tools and techniques (e.g., ETL processes).
- Excellent analytical, problem-solving, and troubleshooting skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Technical and Professional Experience
- Relevant certifications (e.g., Microsoft Certified: Database Administrator, Oracle Database Administrator).
- Experience with cloud-based database services (e.g., Azure SQL Database, AWS RDS, Google Cloud SQL).
- Experience with NoSQL databases.
- Knowledge of scripting languages (e.g., Python, PowerShell).
- Experience with data warehousing concepts and technologies.
- Familiarity with Agile development methodologies.

Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
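To make the indexing and query-plan items above concrete, here is a runnable sketch using Python's built-in sqlite3 as a stand-in for an enterprise RDBMS. The table and data are invented; the same create-index/inspect-plan loop applies to SQL Server or PostgreSQL.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
    conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                     [(i % 100, i * 1.5) for i in range(10_000)])

    # Before indexing: the plan is a full table scan.
    print(conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7").fetchall())

    conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

    # After indexing: the plan switches to an index search.
    print(conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7").fetchall())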

Posted 2 weeks ago

Apply

0.0 - 3.0 years

0 Lacs

Tardeo, Mumbai, Maharashtra

On-site

Source: Indeed

Role: Senior .NET Developer
Job type: Full-time
Role type: Technical
Location: Mumbai

Mid-level resource with 3+ years of experience. Extensive experience with Microsoft technologies including .NET, ASP.NET Core MVC, C#, and MS SQL Server, plus WPF, WCF, XML, XSL, and scripting languages including jQuery/JavaScript and HTML. Experience with ASP.NET Core 2 MVC is an added advantage. Knowledge of SQL Server 2012, indexing, queries, and SSIS/SSRS is good to have.

● Has implemented Ajax controls in C#/.NET projects
● Complete understanding of MS SQL databases
● Data modelling to visualize database structure
● Good understanding of reviewing query performance and optimizing code
● Designing and coding database tables to store the application's data
● Creating database triggers, stored procedures & functions
● Creating table indexes to improve database performance
● Experience in writing and running unit tests on own code (see the sketch after this listing)

About Andesoft Consulting:
Andesoft is a boutique interactive services shop strategically combining business analytics and design. The primary domain expertise covers web architecture, CMS and CRM technologies; market and business analytics to achieve better market segmentation and campaign management; and custom offline and online interactive applications. Some of the business verticals we specialize in include health care, financial services, and public and non-profit sectors.
Company Profile: http://www.andesoftconsulting.com

Qualification & Experience:
● Engineering Graduate or Post Graduate.
● BS degree in Information Technology, Computer Science or equivalent.
● 3 years of professional experience.

Job Types: Full-time, Permanent
Pay: ₹400,000.00 - ₹1,000,000.00 per year
Location Type: In-person
Schedule: Day shift
Ability to commute/relocate: Tardeo, Mumbai, Maharashtra: Reliably commute or planning to relocate before starting work (Required)
Education: Bachelor's (Required)
Experience: relevant work: 3 years (Required)
Work Location: In person
Expected Start Date: 10/06/2025
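The posting asks for unit tests on your own database code; here is a tiny runnable illustration of that habit, transposed to Python's standard library (the posting's actual stack would use C# with an xUnit/NUnit-style framework; the table and function are invented).

    import sqlite3
    import unittest

    def top_customers(conn, limit=3):
        # Data-access function under test: highest-spending customers first.
        return conn.execute(
            "SELECT customer_id, SUM(total) AS spend FROM orders "
            "GROUP BY customer_id ORDER BY spend DESC LIMIT ?", (limit,)).fetchall()

    class TopCustomersTest(unittest.TestCase):
        def setUp(self):
            self.conn = sqlite3.connect(":memory:")
            self.conn.execute("CREATE TABLE orders (customer_id INT, total REAL)")
            self.conn.executemany("INSERT INTO orders VALUES (?, ?)",
                                  [(1, 50.0), (1, 25.0), (2, 60.0)])

        def test_ranking(self):
            rows = top_customers(self.conn, limit=2)
            self.assertEqual([r[0] for r in rows], [1, 2])  # 75.0 beats 60.0

    if __name__ == "__main__":
        unittest.main()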

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

Remote

Source: LinkedIn

Company Description
Seosaph-infotech is a rapidly growing company specializing in customized software development. In just two years, we have delivered exceptional solutions to industries including finance, healthcare, and e-commerce. We aim to transform our clients into vertical leaders by integrating optimal technology solutions and providing trusted services. As an IT junction, Seosaph offers multiple tech solutions to businesses from various verticals.

Must Have: SQL Server and Couchbase.

Responsibilities:
- Design, write, and optimize complex N1QL queries to ensure high-performance Couchbase databases (see the sketch after this listing).
- Develop and maintain .NET application components to integrate seamlessly with Couchbase databases.
- Troubleshoot and resolve database performance issues to ensure reliability and scalability.
- Stay updated with Couchbase advancements and drive the adoption of new features.
- Participate in Agile development processes, including sprint planning and code reviews.

Skills and Qualifications:
- Senior-level expertise in Couchbase and N1QL query development, with a proven ability to optimize complex queries.
- Strong understanding of Couchbase architecture, indexing, and performance tuning.
- Basic programming experience, preferably in C# or .NET frameworks.
- Familiarity with Agile methodologies and collaborative development environments.
- Excellent problem-solving skills and a keen attention to detail.
- 5+ years of professional experience working with Couchbase.

Preferred Skills:
- Mid-level experience in .NET development, including C#, ASP.NET, and Entity Framework.
- Experience with other NoSQL databases (e.g., MongoDB, Cassandra).
- Knowledge of RESTful APIs and microservices architecture.
- 2+ years of experience with .NET.

Location: Office at Bangalore; hybrid or remote. (ref:hirist.tech)
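A hedged sketch of the N1QL work described, shaped around the Couchbase Python SDK (class locations follow the 4.x documentation but vary between SDK versions; the connection string, credentials, and bucket name are assumptions).

    from couchbase.auth import PasswordAuthenticator
    from couchbase.cluster import Cluster
    from couchbase.options import ClusterOptions

    cluster = Cluster("couchbase://localhost",
                      ClusterOptions(PasswordAuthenticator("user", "password")))

    # A composite index on (status, customer_id) covers the query below,
    # so it can be answered from the index without fetching full documents.
    for _ in cluster.query(
            "CREATE INDEX idx_orders_status ON `orders`(status, customer_id)"):
        pass  # N1QL statements execute lazily; iterating forces execution

    for row in cluster.query(
            "SELECT customer_id FROM `orders` WHERE status = 'shipped'"):
        print(row)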

Posted 2 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Area(s) of responsibility

About Us: Empowered By Innovation
Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company's consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group's 170-year heritage of building sustainable communities.

Job Description: DB Developer
Position: DB Developer
Location: Mumbai
Experience: 4-6 years

Position Overview: We are seeking a skilled Database Developer to design, develop, and maintain efficient database systems. The ideal candidate will have strong expertise in database programming, optimization, and troubleshooting to ensure high availability and performance of database solutions that support our applications.

Responsibilities
- Design, develop, and maintain scalable database systems based on business needs.
- Write complex SQL queries, stored procedures, triggers, and functions.
- Optimize database performance, including indexing, query tuning, and normalization.
- Implement and maintain database security, backup, and recovery strategies.
- Collaborate with developers to integrate databases with application solutions.
- Troubleshoot database issues and ensure high availability and reliability.
- Design and maintain data models and database schemas.
- Create ETL (Extract, Transform, Load) processes for data migration and transformation (see the sketch after this listing).
- Monitor database performance and provide recommendations for improvements.
- Document database architecture, procedures, and best practices.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Database Developer or in a similar role.
- Proficiency in database technologies such as SQL Server, Oracle, MySQL, or PostgreSQL.
- Expertise in writing complex SQL scripts and query optimization.
- Experience with database tools like SSIS, SSRS, or Power BI.
- Familiarity with NoSQL databases like MongoDB or Cassandra (optional).
- Strong knowledge of database security, data modeling, and performance tuning.
- Hands-on experience with ETL processes and tools.
- Knowledge of cloud-based database solutions (AWS RDS, Azure SQL, etc.).
- Excellent problem-solving skills and attention to detail.

Preferred Skills
- Experience in Agile/Scrum methodologies.
- Knowledge of scripting languages like Python, PowerShell, or Shell scripting.
- Familiarity with DevOps practices for database deployment and CI/CD pipelines.
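A minimal sketch of the kind of ETL step the responsibilities mention, with SQLite standing in for the target RDBMS; the CSV layout (date, region, amount headers), file names, and table are assumptions.

    import csv
    import sqlite3

    target = sqlite3.connect("warehouse.db")
    target.execute("CREATE TABLE IF NOT EXISTS sales "
                   "(sale_date TEXT, region TEXT, amount REAL)")

    def load_sales(csv_path: str) -> int:
        # Extract from CSV, transform (normalize region, drop blank amounts), load.
        with open(csv_path, newline="") as f:
            rows = [(r["date"], r["region"].strip().upper(), float(r["amount"]))
                    for r in csv.DictReader(f) if r["amount"]]
        target.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
        target.commit()
        return len(rows)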

Posted 2 weeks ago

Apply

9.0 - 12.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Source: LinkedIn

Join Amgen's Mission to Serve Patients
If you feel like you're part of something bigger, it's because you are. At Amgen, our shared mission, to serve patients, drives all that we do. It is key to our becoming one of the world's leading biotechnology companies. We are global collaborators who achieve together: researching, manufacturing, and delivering ever-better products that reach over 10 million patients worldwide. It's time for a career you can be proud of.

What You Will Do
Let's do this. Let's change the world. In this vital role, we are seeking a strategic and hands-on Specialist Software Engineer / AI Engineer - Search to lead the design, development, and deployment of AI-powered search and knowledge discovery solutions across our pharmaceutical enterprise. In this role, you'll manage a team of engineers and work closely with data scientists, oncologists, and domain experts to build intelligent systems that help users across R&D, medical, and commercial functions find relevant, actionable information quickly and accurately.

- Architect and lead the development of scalable, intelligent search systems leveraging NLP, embeddings, LLMs, and vector search (see the retrieval sketch after this listing).
- Own the end-to-end lifecycle of search solutions, from ingestion and indexing to ranking, relevancy tuning, and UI integration.
- Build systems that surface scientific literature, clinical trial data, regulatory content, and real-world evidence using semantic and contextual search.
- Integrate AI models that improve search precision, query understanding, and result summarization (e.g., generative answers via LLMs).
- Partner with platform teams to deploy search solutions on scalable infrastructure (e.g., Kubernetes, cloud-native services, Databricks, Snowflake).
- Experience in generative AI on search engines.
- Experience in integrating generative AI capabilities and vision models to enrich content quality and user engagement.
- Build and own the next generation of content knowledge platforms and other algorithms/systems that create high-quality, unique experiences.
- Design and implement advanced AI models for entity matching and data deduplication.
- Experience with generative AI tasks such as content summarization, deduplication, and metadata quality.
- Research and develop advanced AI algorithms, including vision models for visual content analysis.
- Implement KPI measurement frameworks to evaluate the quality and performance of delivered models, including those utilizing generative AI.
- Develop and maintain deep learning models for data quality checks, visual similarity scoring, and content tagging.
- Continually research current and emerging technologies and propose changes where needed.
- Implement GenAI solutions, utilize ML infrastructure, and contribute to data preparation, optimization, and performance enhancements.
- Manage and mentor a cross-functional engineering team focused on AI, ML, and search infrastructure.
- Foster a collaborative, high-performance engineering culture with a focus on innovation and delivery.
- Work with domain experts, data stewards, oncologists, and product managers to align search capabilities with business and scientific needs.

Basic Qualifications:
- Degree in computer science & engineering preferred, with 9-12 years of software development experience.
- Proficient in AI/ML models, NLP, Python, Elasticsearch/Solr/OpenSearch, GraphQL, NoSQL, cloud CI/CD build pipelines, and APIs.
- Proven experience building search systems with technologies like Elasticsearch, Solr, OpenSearch, or vector databases (e.g., Pinecone, FAISS).
- Hands-on experience with various AI models, GCP search engines, and GCP cloud services.
- Strong understanding of NLP, embeddings, transformers, and LLM-based search applications.
- Proficient in AI/ML programming with Python, GraphQL, Java crawlers, JavaScript, SQL/NoSQL, Databricks/RDS, data engineering, S3 buckets, and DynamoDB.
- Strong problem-solving and analytical skills; ability to learn quickly; excellent communication and interpersonal skills.
- Experience deploying ML services and search infrastructure in cloud environments (AWS, Azure, or GCP).

Preferred Qualifications:
- Experience in AI/ML, Java, REST APIs, and Python.
- Proficient in Databricks and Java.
- Experienced with FastAPI (Python).
- Experience with design patterns, data structures, data modelling, and data algorithms.
- Knowledge of ontologies and taxonomies such as MeSH, SNOMED CT, UMLS, or MedDRA.
- Familiarity with MLOps, CI/CD for ML, and monitoring of AI models in production.
- Experienced with the AWS/Azure platforms, building and deploying code.
- Experience with PostgreSQL/MongoDB SQL databases, vector databases for large language models, Databricks or RDS, and S3 buckets.
- Experience with Google Cloud Search and Google Cloud Storage.
- Experience with popular large language models.
- Experience with the LangChain or LlamaIndex frameworks for language models.
- Experience with prompt engineering and model fine-tuning.
- Knowledge of NLP techniques for text analysis and sentiment analysis.
- Experience with generative AI or retrieval-augmented generation (RAG) frameworks in a pharma/biotech setting.
- Experience in Agile software development methodologies.
- Experience in end-to-end testing as part of test-driven development.

Good to Have Skills
- Willingness to work on full-stack applications.
- Experience working with biomedical or scientific data (e.g., PubMed, clinical trial registries, internal regulatory databases).

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, remote teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

What You Can Expect From Us
As we work to develop treatments that take care of others, we also work to care for our teammates' professional and personal growth and well-being. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. In our quest to serve patients above all else, Amgen is the first to imagine, and the last to doubt. Join us. careers.amgen.com

Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
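A toy sketch of the embedding/vector-search retrieval step behind the semantic search described above. The embed() function is a deterministic placeholder, not a real model; a production system would call a trained embedding model and store vectors in a vector database such as the FAISS or Pinecone options the posting lists.

    import hashlib
    import numpy as np

    def embed(text: str) -> np.ndarray:
        # Placeholder: deterministic pseudo-embedding, NOT a real model.
        seed = int.from_bytes(hashlib.md5(text.encode()).digest()[:4], "big")
        v = np.random.default_rng(seed).standard_normal(384)
        return v / np.linalg.norm(v)

    docs = ["phase III oncology trial results",
            "regulatory submission checklist",
            "real-world evidence study design"]
    index = np.stack([embed(d) for d in docs])  # one unit vector per document

    query = embed("clinical trial outcomes")
    scores = index @ query                      # cosine similarity (unit vectors)
    for i in np.argsort(scores)[::-1][:2]:      # two nearest documents
        print(round(float(scores[i]), 3), docs[i])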

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Role Name: BI Platform Administrator
Job Posting Title: BI Platform Administrator
Workday Job Profile: BI Platform Administrator
Department Name: Digital, Technology & Innovation
Location: Hyderabad, India
Job Type: Full-time

About Amgen
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 45 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

About The Role
Role Description: This role is responsible for performance monitoring, maintenance, and reliable operation of BI platforms, BI servers, and databases. It involves managing BI servers and user administration across different environments, ensuring data is stored and retrieved efficiently, safeguarding sensitive information, and ensuring the uptime, performance, and security of IT infrastructure and software. We are seeking a skilled BI Platform Administrator to manage, maintain, and optimize our enterprise Power BI and Tableau platforms. The ideal candidate will ensure seamless performance, governance, user access, platform upgrades, troubleshooting, and best practices across our BI environments.

Roles & Responsibilities:
- Administer and maintain Power BI Service, Power BI Report Server, and Tableau Server/Online on cloud platforms (AWS, Azure, or GCP); AWS experience preferred.
- Configure, monitor, and optimize performance, capacity, and availability of BI platforms.
- Set up and manage user roles, permissions, and security policies.
- Manage BI platform upgrades, patches, and migrations.
- Monitor scheduled data refreshes and troubleshoot failures (see the sketch after this listing).
- Implement governance frameworks to ensure compliance with data policies.
- Collaborate with BI developers, data engineers, and business users for efficient platform usage.
- Automate routine administrative tasks using scripts (PowerShell, Python, etc.).
- Install, configure, and maintain BI tools on different operating systems, servers, and applications to ensure their reliability and performance.
- Monitor platform performance and uptime, addressing any issues promptly to prevent service interruptions.
- Implement and maintain security measures to protect platforms from unauthorized access, vulnerabilities, and other threats.
- Manage backup procedures and ensure data is securely backed up and recoverable in case of system failures.
- Provide technical support to users, troubleshooting and resolving issues related to system access, performance, and software.
- Apply operating system updates, patches, and configuration changes as necessary.
- Maintain detailed documentation of platform configurations, procedures, policies, and change management.
- Work closely with network administrators, database administrators, and other IT professionals to ensure platforms are integrated and functioning optimally.
- Monitor and optimize database performance, including query tuning, indexing, and resource allocation.
- Work closely with developers, data engineers, system administrators, and other IT staff to support database-related needs and ensure optimal platform performance.

Basic Qualifications and Experience:
- 5+ years of overall experience administering BI platforms preferred.
- 3+ years of experience administering Power BI Service and/or Power BI Report Server.
- 2+ years of experience administering Tableau Server or Tableau Cloud.
- Strong knowledge of Active Directory, SSO/SAML, and role-based access control (RBAC).
- Experience with platform monitoring and troubleshooting (Power BI Gateway logs, Tableau logs, etc.).
- Scripting experience (e.g., PowerShell, DAX, or Python) for automation and monitoring.
- Strong understanding of data governance, row-level security, and compliance practices.
- Experience working with enterprise data sources (SQL Server, Snowflake, Oracle, etc.).
- Familiarity with capacity planning, load balancing, and scaling strategies for BI tools.

Functional Skills:
Should Have:
- Knowledge of Power BI Premium capacity management and Tableau resource management.
- Experience integrating BI platforms with CI/CD pipelines and DevOps tools.
- Hands-on experience in user adoption tracking, audit logging, and license management.
- Ability to conduct health checks and implement performance-tuning recommendations.
- Understanding of multi-tenant environments and large-scale deployments.

Good to Have:
- Experience with the Power BI REST API or Tableau REST API for automation.
- Familiarity with AWS services and/or their Azure/GCP equivalents.
- Background in data visualization or report development for better user collaboration.
- Exposure to other BI tools (e.g., Looker, Qlik, MicroStrategy).
- Knowledge of ITIL practices or experience working in a ticket-based support environment.
- Experience in a regulated industry (finance, healthcare, etc.) with strong compliance requirements.

Education & Experience:
- Master's degree with 1-2+ years of experience in Business, Engineering, IT or a related field; OR
- Bachelor's degree with 2-3+ years of experience in Business, Engineering, IT or a related field; OR
- Diploma with 5+ years of experience in Business, Engineering, IT or a related field.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.

Shift Information:
This position requires working a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required by business needs.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
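A hedged sketch of automating the refresh-failure check listed above. The endpoint shape follows Microsoft's public Power BI REST API (dataset refresh history), but the workspace/dataset IDs and token acquisition are assumptions; Tableau would use its own REST API instead.

    import requests

    TOKEN = "..."           # acquired via Azure AD (e.g., MSAL) in practice
    GROUP = "group-id"      # hypothetical workspace id
    DATASET = "dataset-id"  # hypothetical dataset id

    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP}"
           f"/datasets/{DATASET}/refreshes?$top=5")
    resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()

    for refresh in resp.json().get("value", []):
        if refresh.get("status") == "Failed":  # statuses include Completed/Failed
            print("Refresh failed at", refresh.get("endTime"))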

Posted 2 weeks ago

Apply

6.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Join Amgen's Mission to Serve Patients
If you feel like you're part of something bigger, it's because you are. At Amgen, our shared mission, to serve patients, drives all that we do. It is key to our becoming one of the world's leading biotechnology companies. We are global collaborators who achieve together: researching, manufacturing, and delivering ever-better products that reach over 10 million patients worldwide. It's time for a career you can be proud of.

What You Will Do
Let's do this. Let's change the world. In this vital role, we are seeking a highly skilled and hands-on Senior Software Engineer - Search to drive the development of intelligent, scalable search systems across our pharmaceutical organization. You'll work at the intersection of software engineering, AI, and life sciences to enable seamless access to structured and unstructured content, spanning research papers, clinical trial data, regulatory documents, and internal scientific knowledge. This is a high-impact role where your code directly accelerates innovation and decision-making in drug development and healthcare delivery.

- Design, implement, and optimize search services using technologies such as Elasticsearch, OpenSearch, Solr, or vector search frameworks (see the sketch after this listing).
- Collaborate with data scientists and analysts to deliver data models and insights.
- Develop custom ranking algorithms, relevancy tuning, and semantic search capabilities tailored to scientific and medical content.
- Support the development of intelligent search features like query understanding, question answering, summarization, and entity recognition.
- Build and maintain robust, cloud-native APIs and backend services to support high-availability search infrastructure (e.g., AWS, GCP, Azure).
- Implement CI/CD pipelines, observability, and monitoring for production-grade search systems.
- Work closely with Product Owners and Tech Architects.
- Enable indexing of both structured (e.g., clinical trial metadata) and unstructured (e.g., PDFs, research papers) content.
- Design and develop modern data management tools to curate our most important data sets, models, and processes, while identifying areas for process automation and further efficiencies.
- Expertise in programming languages such as Python, Java, React, TypeScript, or similar.
- Strong experience with data storage and processing technologies (e.g., Hadoop, Spark, Kafka, Airflow, SQL/NoSQL databases).
- Demonstrate strong initiative and the ability to work with minimal supervision or direction.
- Strong experience with cloud infrastructure (AWS, Azure, or GCP) and infrastructure as code such as Terraform.
- In-depth knowledge of relational and columnar SQL databases, including database design.
- Expertise in data warehousing concepts (e.g., star schema, entitlement implementations, SQL vs. NoSQL modeling, milestoning, indexing, partitioning).
- Experience in REST and/or GraphQL.
- Experience in creating Spark jobs for data transformation and aggregation.
- Experience with distributed, multi-tiered systems, algorithms, and relational databases.
- Strong rapid-prototyping skills; can quickly translate concepts into working code.
- Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software.
- Analyze and understand the functional and technical requirements of applications.
- Identify and resolve software bugs and performance issues.
- Work closely with multi-functional teams, including product management, design, and QA, to deliver high-quality software on time.
- Maintain detailed documentation of software designs, code, and development processes.

Basic Qualifications:
- Degree in computer science & engineering preferred, with 6-8 years of software development experience.
- Proficient in Databricks, data engineering, Python, search algorithms using NLP/AI models, GCP cloud services, and GraphQL.
- Hands-on experience with search technologies (Elasticsearch, Solr, OpenSearch, or Lucene).
- Hands-on experience with full-stack software development.
- Proficient in Java and Python, plus FastAPI, Databricks/RDS, data engineering, S3 buckets, ETL, Hadoop, Spark, Airflow, and AWS Lambda.
- Experience with data streaming frameworks (Apache Kafka, Flink).
- Experience with cloud platforms (AWS, Azure, Google Cloud) and related services (e.g., S3, Redshift, BigQuery, Databricks).
- Hands-on experience with various cloud services, understanding their pros and cons under well-architected cloud design principles.
- Working knowledge of serverless tools such as AWS Lambda.
- Strong problem-solving and analytical skills; ability to learn quickly; excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience in Python, Java, React, FastAPI, TypeScript, JavaScript, and CSS/HTML is desirable.
- Experienced with API integration, serverless, and microservices architecture.
- Experience in Databricks, PySpark, Spark, SQL, ETL, and Kafka.
- Solid understanding of data governance, data security, and data quality best practices.
- Experience with unit testing, building, and debugging code.
- Experienced with the AWS/Azure platforms, building and deploying code.
- Experience with vector databases for large language models, and Databricks or RDS.
- Experience with DevOps CI/CD build and deployment pipelines.
- Experience in Agile software development methodologies.
- Experience in end-to-end testing.
- Familiarity with additional modern database technologies.

Good to Have Skills
- Willingness to work on AI applications.
- Experience in MLOps, React, JavaScript, Java, and GCP search engines.
- Experience with popular large language models.
- Experience with the LangChain or LlamaIndex frameworks for language models.
- Experience with prompt engineering and model fine-tuning.
- Knowledge of NLP techniques for text analysis and sentiment analysis.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global teams.
- High degree of initiative and self-motivation.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

What You Can Expect From Us
As we work to develop treatments that take care of others, we also work to care for our teammates' professional and personal growth and well-being. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. In our quest to serve patients above all else, Amgen is the first to imagine, and the last to doubt. Join us. careers.amgen.com

Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
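A hedged sketch of the index-and-query loop behind the search work described, in the style of the official elasticsearch-py client (8.x keyword arguments; the index name, mapping, and document are illustrative).

    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")

    es.index(index="trials", id="1", document={
        "title": "Phase III study of drug X in NSCLC",
        "phase": "III",
    })
    es.indices.refresh(index="trials")  # make the document searchable immediately

    hits = es.search(index="trials", query={"match": {"title": "phase III NSCLC"}})
    for h in hits["hits"]["hits"]:
        print(h["_score"], h["_source"]["title"])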

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

India

Remote

Source: LinkedIn

Job Title: SEO Specialist (Remote / Contractor)

About Famous: Famous is a leading creator marketplace where brands effortlessly connect with creators to drive authentic, high-impact content campaigns. With over 100K ready-to-order creators, curated industry collections, transparent real-time pricing, advanced performance insights, and seamless payment management, Famous is redefining brand-creator collaboration.

Role Overview: We're seeking an experienced and passionate SEO Specialist to join our remote team. Your primary goal will be to strategically enhance our SEO presence, ensuring Famous ranks high for relevant keywords, driving organic traffic, and maximizing visibility in our industry.

Responsibilities:
- Conduct thorough keyword research to identify high-impact opportunities.
- Optimize website content, metadata, and landing pages to boost organic search rankings.
- Regularly audit technical SEO elements (website speed, mobile optimization, indexing, etc.) to ensure top-notch performance (see the audit sketch after this listing).
- Collaborate with content and marketing teams to plan and execute SEO-driven content strategies, including blogs, articles, and industry reports.
- Monitor, analyze, and report on SEO performance using tools such as Google Analytics, Google Search Console, Ahrefs, SEMrush, etc.
- Stay current with SEO industry trends, algorithms, and best practices to continuously refine strategies.
- Build high-quality backlink strategies and partnerships to improve domain authority and search rankings.
- Develop and implement localized SEO strategies for targeted market segments.

Qualifications:
- 3+ years of proven SEO experience, preferably with marketplaces, SaaS, or digital platforms.
- Demonstrable success in developing and executing effective SEO strategies that improved rankings and increased organic traffic.
- Proficiency in SEO tools (Google Analytics, Google Search Console, SEMrush, Ahrefs, Screaming Frog, Moz).
- Excellent understanding of search engine algorithms and ranking methods.
- Experience with HTML, CSS, and JavaScript as they relate to SEO.
- Strong analytical skills and a data-driven mindset.
- Exceptional communication and collaboration skills.
- Ability to thrive in a remote and highly collaborative environment.
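A small sketch of automating one audit item above (title, meta-description, and canonical checks), using requests and BeautifulSoup; the 60-character title threshold is a common rule of thumb, not a hard limit.

    import requests
    from bs4 import BeautifulSoup

    def audit_page(url: str) -> list:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        issues = []
        title = (soup.title.string or "").strip() if soup.title else ""
        if not title:
            issues.append("missing <title>")
        elif len(title) > 60:
            issues.append(f"title may be truncated in SERPs ({len(title)} chars)")
        desc = soup.find("meta", attrs={"name": "description"})
        if not (desc and desc.get("content")):
            issues.append("missing meta description")
        if not soup.find("link", attrs={"rel": "canonical"}):
            issues.append("missing canonical link")
        return issues

    print(audit_page("https://example.com"))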

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

India

Remote

Source: LinkedIn

Company Description
Aarcalev Technology Solutions Pvt. Limited is a global company focused on transforming organizations and individuals with services in technology jobs, business optimization, IT development, cybersecurity, cloud technology, and emerging technologies like AI and blockchain. We are dedicated to environmental protection and sustainability.

Role Description
This is a full-time hybrid role for a Full Stack Engineer - C#, .NET at Aarcalev Technology Solutions. The engineer will be responsible for both back-end and front-end web development tasks, software development, and CSS. The role will be primarily remote.

Qualifications: Full-Stack Developer Requirements

C# (.NET)
· Minimum 3 years of hands-on experience with C# and the .NET framework (.NET 6+ preferred)
· Strong understanding of object-oriented programming, dependency injection, and asynchronous programming
· Experience building and maintaining RESTful APIs and microservices
· Familiarity with Entity Framework (EF Core) and LINQ

.NET
· Proven experience with .NET (Web API)
· Ability to build secure, scalable, and testable backend services
· Experience with middleware, routing, model binding, and authentication/authorization patterns (e.g., JWT, OAuth)
· Familiarity with unit testing frameworks like xUnit or NUnit

Angular
· Minimum 2–3 years of experience with Angular (v10+)
· Strong knowledge of component-based architecture, RxJS, and the Angular CLI
· Experience with state management (e.g., NgRx, BehaviorSubjects) is a plus
· Ability to consume and integrate REST APIs into Angular services

HTML / CSS / JavaScript
· Strong command of semantic HTML5, modern CSS (Flexbox/Grid), and vanilla JavaScript (ES6+)
· Experience creating responsive UI/UX using frameworks like Angular Material is a plus
· Familiarity with cross-browser compatibility, accessibility standards (WCAG), and browser dev tools

SQL Server
· 2–3 years of experience with SQL Server (2016 or newer)
· Ability to write and optimize complex T-SQL queries, stored procedures, views, and functions
· Familiarity with database normalization, indexing, query performance tuning, and data migration
· Experience with SQL Server Management Studio (SSMS) and database design

API Development
· Experience designing, developing, and documenting RESTful APIs
· Knowledge of OpenAPI/Swagger, Postman, and versioning best practices
· Understanding of HTTP methods, status codes, authentication, and rate limiting
· Bonus: experience with GraphQL or third-party integrations

Preferred Experience
· Minimum 3–5 years total experience as a full-stack developer
· Comfortable working in agile/scrum environments
· Familiarity with Git, CI/CD pipelines, and basic DevOps workflows
· Strong communication skills, capable of working remotely and independently

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Job Title: Software Engineer - Distributed Computing & Database Internals Specialist
Location: Noida
Experience: Minimum 2+ years

Engineer large-scale distributed systems and optimize core execution engines! At Zettabolt, we don't just write code: we engineer high-performance databases and distributed systems that power large-scale applications. Our work spans low-level optimizations in database internals to high-level system-wide debugging and architecture design. If you love deep-diving into execution engines, query optimizations, and database internals, while also understanding how these components interact at scale, this is the role for you!

What You'll Do:

Worm's-Eye View (Deep Dive into Internals):
- Work at the source-code level of distributed computing frameworks (e.g., Apache Spark) to optimize query execution, indexing, storage engines, and concurrency mechanisms (see the plan-inspection sketch after this listing).
- Improve execution efficiency, optimize computation overhead, and fine-tune scheduling and resource management in distributed environments.
- Enhance query processing engines, execution planners, and cost-based optimizers.

Bird's-Eye View (System-Level Debugging & Architecture):
- System-level debugging: identify bottlenecks across compute, network, and storage layers to improve overall system efficiency.
- Architect and optimize end-to-end distributed systems, ensuring smooth integration of components like databases, caching layers, compute nodes, and messaging queues.
- Understand the business impact: work closely with customers to align technical solutions with their business needs, ensuring performance optimizations translate to tangible cost and efficiency benefits.

What You Bring to the Table:
- 2+ years of hardcore coding experience in Python, Java, C++, or similar languages.
- Strong understanding of database internals (query execution, indexing, transaction processing, storage engines).
- Experience in system-level debugging, identifying performance bottlenecks across compute, memory, storage, and network layers.
- Strong knowledge of data structures & algorithms.
- Hands-on experience working with SQL/NoSQL databases and Linux systems.

Bonus Points (Good to Have):
- Experience modifying or extending open-source distributed systems (e.g., Apache Spark, Hadoop, Presto).
- Exposure to cloud platforms (AWS, Azure, Google Cloud).
- Familiarity with containerization tools like Docker or Kubernetes.

Why Zettabolt?
- Health & life insurance
- Flexible working hours
- Breakfast, lunch, and snacks (in-office)
- Biannual team retreats (company-sponsored trips to exciting destinations!)
- Reimbursement for books & training
- Continuous learning sessions (C++/Java/DS/Algo/Big Data/Database Internals & more!)

About Zettabolt:
Founded by industry veterans, Zettabolt is at the forefront of Big Data, Cloud Operations, and High-Performance Computing. We don't just write software: we engineer large-scale distributed systems that power business-critical applications. From deep optimizations in database internals to high-level system debugging and architecture design, we connect the dots between technology and business impact. If you want to truly understand how large-scale systems work, from the lowest level of execution internals to the highest level of architectural decision-making, this is the place for you!

More details: https://zettabolt.com/jobs
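A hedged sketch of the worm's-eye starting point described above: asking Spark for its query plans before touching the engine. Requires a local pyspark (3.x) installation; the data and column names are invented.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("plan-inspection").getOrCreate()

    df = spark.createDataFrame(
        [(1, "a", 10.0), (2, "b", 20.0), (1, "a", 5.0)],
        ["customer_id", "region", "total"])

    agg = (df.filter(F.col("total") > 1.0)
             .groupBy("customer_id")
             .agg(F.sum("total").alias("spend")))

    # Prints the parsed, analyzed, and optimized logical plans plus the
    # physical plan: the raw material for engine-level tuning work.
    agg.explain(mode="extended")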

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Responsibilities:
We are seeking a PostgreSQL Subject Matter Expert (SME) with in-depth knowledge of PostgreSQL internals and extensive hands-on experience with enterprise-level PostgreSQL implementations. The ideal candidate will provide expert guidance, drive best practices, and contribute to the optimization and scalability of our PostgreSQL database systems. The SME will be responsible for guiding complex database projects, offering strategic insights, and ensuring best practices are followed in all aspects of database management.

· Provide advanced expertise in PostgreSQL internals, including storage, indexing, query optimization, and memory management.
· Provide expert advice on PostgreSQL architecture, design, and best practices to internal teams and stakeholders.
· Oversee the installation, configuration, upgrading, and maintenance of PostgreSQL databases and related applications.
· Lead and mentor development and database administration teams in PostgreSQL best practices and advanced features.
· Analyse and optimize database performance, including query tuning, indexing strategies, and hardware utilization (see the tuning sketch after this listing).
· Design and implement high-availability and disaster recovery solutions, such as replication, clustering, and failover strategies.
· Troubleshoot and resolve complex database issues, ensuring minimal downtime and optimal performance.
· Develop and enforce database standards, security policies, and compliance measures.
· Create and maintain detailed documentation on PostgreSQL database configurations, procedures, and architectural decisions.
· Develop automation scripts and tools to streamline database operations, including backups, maintenance, and monitoring.
· Mentor junior database professionals and provide training on advanced PostgreSQL topics.
· Stay current with PostgreSQL advancements and incorporate relevant technologies and methodologies to improve database systems.
· Actively participate in Postgres communities and forums to contribute to the Postgres ecosystem.

Skills:
· Proven experience in designing and managing large-scale PostgreSQL databases.
· In-depth knowledge of PostgreSQL architecture, replication, partitioning, and indexing.
· Proficiency in SQL and PL/pgSQL programming.
· Strong understanding of database performance optimization techniques.
· Experience with database monitoring tools (e.g., pgAdmin, Datadog, Nagios).
· Familiarity with automation tools (e.g., Ansible, Terraform).
· Hands-on experience in data warehousing and ETL processes.
· Knowledge of data security best practices.
· Excellent problem-solving and analytical skills.
· Effective communication and interpersonal skills.
· Experience with cloud database platforms (e.g., AWS RDS, Azure Database for PostgreSQL).
· Familiarity with other database technologies (e.g., MySQL, Oracle) and data warehousing solutions.
· Knowledge of DevOps practices and CI/CD pipelines.
· Experience with automation tools and scripting languages (e.g., Python, Bash, Ansible).

Qualifications:
Education: Bachelor's or master's degree in Computer Science, Information Technology, or a related field.

Summary
· PostgreSQL Internals & Architecture: deep expertise in PostgreSQL storage, indexing, query optimization, and memory management.
· Performance Optimization: proficiency in query tuning, indexing strategies, and database performance optimization techniques.
· High Availability & Disaster Recovery: experience with replication, clustering, and failover strategies for high availability.
· Automation & Scripting: experience with automation tools (Ansible, Terraform) and scripting languages (Python, Bash).
· Cloud Platforms & DevOps: experience with cloud database platforms (AWS RDS, Azure) and knowledge of DevOps practices and CI/CD pipelines.
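A hedged sketch of the query-tuning loop described, via psycopg2; the DSN, table, and predicate are assumptions. autocommit is enabled because CREATE INDEX CONCURRENTLY cannot run inside a transaction.

    import psycopg2

    conn = psycopg2.connect("dbname=appdb user=postgres")  # illustrative DSN
    conn.autocommit = True
    cur = conn.cursor()

    cur.execute("EXPLAIN (ANALYZE, BUFFERS) "
                "SELECT * FROM orders WHERE customer_id = %s", (7,))
    for (line,) in cur.fetchall():
        print(line)  # look for Seq Scan vs Index Scan, row estimates, timing

    # If the plan shows a sequential scan on a selective predicate:
    cur.execute("CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_orders_customer "
                "ON orders (customer_id)")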

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Title: Organic Marketing Specialist (AI-Driven) - DTC Brand
Location: Pune - full-time, in office
Experience: 3-6 years
Timing: 3 pm - 11 pm | Monday to Saturday (2nd & 4th Saturday off)
Joining: Immediate
Website: https://www.globaldigitalonline.com/
Address: NIBM Road, Pune

About Us
We're a fast-scaling DTC brand built around performance, storytelling, and community. We blend human creativity with AI-driven systems to create a growth engine across organic and paid channels. Now, we're looking for an Organic Marketing Specialist who can own organic performance end-to-end, with an edge in leveraging AI tools and automation for faster, smarter execution.

Your Mission
Lead our organic SEO strategy with a bias for growth, innovation, and speed. Use traditional SEO fundamentals, but bring modern, AI-powered tactics to the table, from rapid content generation to entity-based optimization, search intent modelling, and programmatic SEO.

Key Responsibilities
- Build and scale an AI-augmented SEO strategy across site structure, content, and backlinks
- Own technical SEO: page speed, crawlability, Core Web Vitals, structured data, indexing
- Use tools like ChatGPT, Surfer SEO, Jasper, Frase, or similar to speed up content workflows
- Plan, brief, and deploy long-form content, programmatic landing pages, and AI-generated clusters (see the sketch after this listing)
- Collaborate with dev/UX on internal linking, mobile experience, and E-E-A-T
- Integrate search intent + first-party data into keyword selection and optimization
- Use entity-based SEO and semantic optimization to future-proof content
- Track and analyze performance using GA4, Search Console, Looker Studio, and other tools
- Oversee ethical, white-hat link-building and PR strategies (digital + influencer)
- Stay ahead of AI/SEO algorithm changes and test new formats, like voice search & SERP features

Requirements
- 3-6 years in SEO (preferably DTC, eComm, or media-heavy brands)
- Experience with Shopify, headless CMS, or custom DTC tech stacks
- Strong technical SEO knowledge and ability to run audits + implementation
- Deep familiarity with AI SEO tools, large language models, and workflow automation
- Proven track record of scaling organic traffic and increasing revenue through SEO
- Ability to write/edit or quality-check content at scale with human + AI workflows
- Strategic thinking + tactical execution (you're both planner and doer)

Bonus Skills
- Experience with programmatic SEO, schema mapping, or custom page builders
- Experience integrating organic and paid search strategies
- Prior work with eCommerce analytics stacks (GTM, GA4, attribution, CRO tools)

What You'll Love
- Ownership of one of the most scalable growth channels in our DTC business
- A team that values experimentation, performance, and speed over perfection
- Tools and budget to build your own AI-powered SEO ecosystem
- Fast-track growth
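A toy sketch of the programmatic-landing-page tactic named above: rendering many intent-targeted page stubs from one template. The keyword list, template, and file layout are illustrative; a real pipeline would plug into the brand's CMS.

    from string import Template

    PAGE = Template("""<title>$kw | Acme DTC</title>
    <meta name="description" content="Shop $kw: compare options, prices and reviews.">
    <h1>$kw</h1>""")

    keywords = ["organic cotton t-shirts", "recycled yoga mats"]
    for kw in keywords:
        slug = kw.replace(" ", "-")  # e.g., organic-cotton-t-shirts.html
        with open(f"{slug}.html", "w") as f:
            f.write(PAGE.substitute(kw=kw.title()))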

Posted 2 weeks ago

Apply

100.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction
About IBM
IBM is a global technology and innovation company. It is the largest technology and consulting employer in the world, with a presence in 170 countries. The diversity and breadth of the entire IBM portfolio of research, consulting, solutions, services, systems and software uniquely distinguishes IBM from other companies in the industry. Over the past 100 years, a lot has changed at IBM. In this new era of Cognitive Business, IBM is helping to reshape industries as diverse as healthcare, retail, banking, travel, manufacturing, and many more, by bringing together our expertise in Cloud, Analytics, Security, Mobile, and the Internet of Things. We like to say, “be essential.” We are changing how we craft. How we collaborate. How we analyse. How we engage. Join the next generation of innovators, inventors and entrepreneurs who are crafting the very way the world works. We want the brightest minds doing work that inspires, in an environment where growth is supported. IBMers get to discover their potential, so they’re inspired to build breakthroughs that help our clients succeed. We’re building teams with dynamic strengths with people who want their ideas to matter. Join us — you’ll be proud to call yourself an IBMer.
Our Culture
IBM is committed to crafting a diverse environment and is proud to be an equal opportunity employer. You will receive consideration for employment without regard to your race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.
Business Unit Introduction
India Systems Development Lab (ISDL) is part of IBM Systems' worldwide technology development labs. Established in 1996, the Lab is headquartered in India's Silicon Valley and startup hub, Bengaluru, with a strong presence in Pune and Hyderabad. Developers at ISDL deliver technology innovations across the entire Systems and Storage portfolio. The team here works across the entire stack, from processor design, firmware, and operating systems to software-defined storage. The lab also focuses on innovation, thanks to the creative energies of its teams. The lab has contributed over 400 patents in cutting-edge technologies and inventions so far. While computing veers towards cognitive, cloud, mobile, social, and security, the lab has significantly contributed not just to new products focused in these areas, but has also ushered in new development models such as Agile, Design Thinking and DevOps.
Your Role And Responsibilities
We are seeking a highly skilled C/C++ expert with 8-10 years of product development, design, and support expertise, and a proven track record of managing databases, particularly with a focus on IBM DB2 administration, performance tuning, and debugging of database issues. As a vital member of our software development team, you will be responsible for designing, implementing, and optimizing database-related functionalities to ensure the high performance, stability, and reliability of our products.
System Software Development and Maintenance: Code, design and implement C/C++ software components for backup and storage products, ensuring adherence to coding standards, best practices, and performance guidelines.
Product Support: Provide technical expertise and support to customers and internal stakeholders regarding database-related inquiries and issues.
Database Design: Collaborate with the development team to design and optimize database schemas, ensuring data integrity and scalability.
System Engineering: Must have a SAN storage background. Hands-on experience in code, unit test and the SDLC for any file systems, volume managers and/or backup recovery products.
Troubleshooting and Debugging: Investigate and resolve complex database issues, working closely with cross-functional teams to identify root causes and implement effective solutions.
Performance Engineering: Use performance tools like Flamegraph, top, iftop, Excel graphing, and analytics.
Performance Tuning: Proactively monitor and analyse database performance metrics to identify areas of optimization. Utilize your extensive experience in performance tuning techniques to enhance query execution and reduce response times.
Continuous Improvement: Stay updated with the latest advancements in database technologies, C/C++ development practices, and system storage principles. Recommend and implement improvements to enhance product performance and maintainability.
Toolkits: Be adept with various tools to parse logs, such as awk, Perl, Python, grep, etc.
Required Technical And Professional Expertise
Bachelor's degree in Computer Science, Software Engineering, or a related field (or equivalent practical experience).
Extensive 3-7 years of experience in C/C++ software development, including product development and support.
Strong expertise in managing IBM DB2 databases, including administration, performance tuning, and debugging of complex database issues.
Strong systems exposure analysing memory corruption, multithreaded designs and SAN architectures.
Proven experience in optimizing database performance through indexing, query optimization, and caching strategies.
Solid understanding of database design principles and best practices.
Familiarity with data security and compliance standards.
Excellent problem-solving and debugging skills to analyse and resolve complex technical issues.
Strong communication and collaboration skills to work effectively in a team-oriented environment.
Experience with other databases, such as SQL Server, Oracle, or MySQL, is a plus.
Relevant certifications in C/C++ development, SAN/NAS certifications or database administration are advantageous.
Tags: OS internals, iSCSI, FC, backup recovery, ZFS, LVM, NVMe, gdb, C/C++, at least one scripting language (bash, Perl, Python)
Required Education
Bachelor's degree in Computer Science, Software Engineering, or a related field (or equivalent practical experience)
Preferred Education
Master's Degree in Computer Science, Software Engineering
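Since the posting calls for fluency in log-parsing tools (awk, Perl, Python, grep), here is a minimal Python sketch of that kind of task; the log format and file name are invented for the example:

import re
from collections import Counter

# Invented log format; real backup/storage product logs will differ.
pattern = re.compile(r"^(?P<ts>\S+ \S+) (?P<level>ERROR|WARN|INFO) (?P<msg>.*)$")

error_counts = Counter()
with open("product.log") as log:
    for line in log:
        match = pattern.match(line)
        if match and match.group("level") == "ERROR":
            # Bucket errors by their leading component, e.g. "db2agent: ..."
            error_counts[match.group("msg").split(":")[0]] += 1

for message, count in error_counts.most_common(10):
    print(f"{count:6d}  {message}")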

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About The Position
Snowbit is a cybersecurity technology innovator with a vision to empower organizations across the globe to quickly, efficiently, and cost-effectively ready themselves to address omnipresent cyber risk. Built on years of Israeli cybersecurity experience, Snowbit is looking to offer the broadest managed detection and response offering available today. Snowbit is part of the Coralogix group, with Coralogix rebuilding the path to observability by offloading the burden of indexing and providing deep insights, at infinite scale, for less than half the cost.
We are looking for an experienced and highly motivated Cloud Security Consultant to join our journey and be a part of the India Security Resource Centre (SRC) team. Joining this team provides a unique opportunity to work in a global security resource centre, benefit from the best of Israeli cybersecurity talent, influence the direction of a world-class offering in the cybersecurity domain, and work closely with Coralogix leadership.
Responsibilities
Gain a deep understanding of Snowbit's security solutions, along with customer cloud environments and security architectures.
Serve as the primary advocate and point of contact for Proof of Concepts (PoCs) and high-profile customer engagements, ensuring successful outcomes.
Ensure the efficient implementation and enforcement of advanced security services provided by the team.
Proactively identify and anticipate potential security escalations at early stages, mitigating risks effectively.
Manage multiple high-priority tasks, including handling escalations with a strategic approach.
Oversee the triage and analysis of security assessments, enhancing incident investigation processes and ensuring swift resolution of security threats.
Communicate with customers on alerts, remediation actions, and incident response using structured, playbook-driven solutions.
Lead quarterly reviews and PoC evaluation calls, while taking ownership of key initiatives and coordinating with various stakeholders.
Collaborate with cross-functional teams—including Customer Success Management (CSM), Security Research, Incident Response, and Product—to strengthen security operations and continuously enhance the Snowbit offering.
Share industry insights, best practices, and technical knowledge within the team while keeping internal documentation and knowledge bases up to date.
Foster a collaborative and growth-driven team culture by leading by example, supporting professional development, and promoting continuous learning and innovation.
Requirements
Availability during the US time zone (5PM–2AM IST) to allow daily interaction with US-based customers.
Bachelor's degree in Computer Science, Engineering, Electrical Engineering, or relevant industry certifications.
Strong communication skills with proficiency in English (written and verbal).
Experience working with multi-regional customers across different locations.
3+ years of experience in customer-facing security operations roles, preferably within SOC/MDR environments handling multiple clients.
Expertise in cybersecurity assessments and incident management methodologies.
Hands-on experience with security technologies such as SOC, MDR, SIEM, SOAR, WAF, IPS and other security solutions.
Strong understanding of the cybersecurity landscape, including common threats, attack vectors, and mitigation strategies.
Familiarity with monitoring, ticketing, and CRM tools to manage security operations efficiently.
Ability to build and maintain strong customer relationships, ensuring a positive and proactive engagement experience.
Preferred Requirements
Experience with cloud services (AWS, Azure, or GCP)
Knowledge of cloud security principles
Security certifications such as CISSP, CEH, CSA
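As an illustration of the playbook-driven alert handling mentioned above, here is a toy Python sketch of severity-based triage; the alert shape and playbook names are invented, since real MDR pipelines are tool-specific:

# Invented playbook table; a production SOAR would hold far richer logic.
PLAYBOOKS = {
    "brute_force": "lock_account_and_notify",
    "malware_detected": "isolate_host",
}

def triage(alert: dict) -> str:
    """Route an alert to a playbook, escalating critical or unknown cases."""
    if alert.get("severity") == "critical":
        return "page_on_call"
    return PLAYBOOKS.get(alert.get("type"), "queue_for_analyst")

print(triage({"type": "brute_force", "severity": "high"}))      # lock_account_and_notify
print(triage({"type": "unknown_ioc", "severity": "critical"}))  # page_on_call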

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Delhi, India

On-site


We are looking for an immediate joiner for the Delhi location (WFO).
Job Description: PostgreSQL DBA
Job Summary:
We are looking for an experienced PostgreSQL Database Administrator (DBA) to lead our database administration and data migration activities, specifically for CRM applications deployed on cloud instances. The ideal candidate will have a strong background in PostgreSQL administration, data migration, and cloud infrastructure, and will play a key role in ensuring database performance, security, and availability during the cloud migration process.
Key Responsibilities:
Database Administration:
· Manage and maintain PostgreSQL databases across cloud instances, ensuring high availability, performance, and security.
· Perform routine database administration tasks such as monitoring, tuning, and backups.
· Implement and maintain database clustering, replication, and partitioning strategies to support scalability and performance.
Data Migration:
· Lead the data migration process from on-premise CRM applications to cloud-based PostgreSQL instances.
· Develop and execute data migration plans, including data mapping, data transformation, and validation strategies.
· Ensure data integrity and minimal downtime during migration activities.
Performance Tuning and Optimization:
· Monitor database performance and optimize queries, indexing, and schema design to improve performance.
· Proactively identify performance bottlenecks and resolve them to ensure smooth database operations.
· Implement database monitoring tools and alerts to quickly detect and address issues.
Cloud Instance Management:
· Work with cloud platforms (such as GCP, AWS, or Azure) to manage PostgreSQL instances in cloud environments.
· Ensure the cloud database environment adheres to security and compliance standards.
· Implement best practices for cloud-based database backup, disaster recovery, and high availability.
Security & Compliance:
· Implement database security measures, including access controls, encryption, and auditing.
· Ensure compliance with industry standards and organizational policies related to data security and privacy.
Collaboration & Documentation:
· Collaborate with application teams, DevOps, and cloud engineers to ensure seamless integration between databases and applications.
· Document all database configurations, procedures, and migration plans.
· Provide technical support and guidance to internal teams regarding database best practices.
Skills and Qualifications:
Technical Skills:
· 3-5 years of experience working as a PostgreSQL DBA, with strong experience managing databases in cloud environments.
· Experience with cloud platforms such as Google Cloud Platform (GCP), AWS, or Azure for managing PostgreSQL databases.
· Proficiency in data migration from on-premise databases to cloud instances.
· Strong knowledge of database performance tuning, query optimization, and indexing strategies.
· Experience with high availability and disaster recovery configurations, such as replication and clustering.
· Familiarity with security best practices for databases, including encryption, role-based access control (RBAC), and auditing.
Soft Skills:
· Strong problem-solving and analytical skills to diagnose and resolve database-related issues.
· Excellent communication skills for collaborating with cross-functional teams.
· Ability to work independently and manage multiple tasks in a fast-paced environment.
Certifications (Preferred but not mandatory):
· Certified PostgreSQL DBA or similar certifications.
· Cloud certifications related to database management, such as AWS Certified Database – Specialty or Google Cloud Professional Database Engineer, are a plus.
Education:
Bachelor’s degree in Computer Science, Information Technology, or a related field. Relevant certifications in database management or cloud platforms are a plus.
Additional Information:
Opportunity to lead large-scale data migration projects for CRM applications on cloud infrastructure. Work with cutting-edge cloud technologies and collaborate with cross-functional teams.
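For a flavour of the on-premise-to-cloud migration work described above, here is a minimal Python sketch that streams rows between two PostgreSQL instances with psycopg2; the DSNs, table, and columns are placeholders, not details from the posting:

import psycopg2
from psycopg2.extras import execute_values

# Placeholder DSNs: source is the on-premise CRM database, destination a
# cloud instance (e.g., a Cloud SQL or RDS endpoint).
src = psycopg2.connect("dbname=crm host=onprem-db user=migrator")
dst = psycopg2.connect("dbname=crm host=cloud-db user=migrator")

with src.cursor(name="stream") as read, dst.cursor() as write:
    read.itersize = 10_000  # server-side cursor keeps client memory flat
    read.execute("SELECT id, name, email FROM contacts")
    while True:
        rows = read.fetchmany(10_000)
        if not rows:
            break
        execute_values(write, "INSERT INTO contacts (id, name, email) VALUES %s", rows)
dst.commit()

# A basic validation step before cutover: row counts must match.
with src.cursor() as a, dst.cursor() as b:
    a.execute("SELECT count(*) FROM contacts")
    b.execute("SELECT count(*) FROM contacts")
    assert a.fetchone() == b.fetchone(), "row counts diverge"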

Posted 2 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


What We Offer
At Magna, you can expect an engaging and dynamic environment where you can help to develop industry-leading automotive technologies. We invest in our employees, providing them with the support and resources they need to succeed. As a member of our global team, you can expect exciting, varied responsibilities as well as a wide range of development prospects, because we believe that your career path should be as unique as you are.
Group Summary
Transforming mobility. Making automotive technology that is smarter, cleaner, safer and lighter. That’s what we’re passionate about at Magna Powertrain, and we do it by creating world-class powertrain systems. We are a premier supplier for the global automotive industry with full capabilities in design, development, testing and manufacturing of complex powertrain systems. Our name stands for quality, environmental consciousness, and safety. Innovation is what drives us, and we drive innovation. Dream big and create the future of mobility at Magna Powertrain.
Company Introduction
At Magna, we create technology that disrupts the industry and solves big problems for consumers, our customers, and the world around us. We’re the only mobility technology company and supplier with complete expertise across the entire vehicle. We are committed to quality and continuous improvement because our products impact millions of people every day. But we’re more than what we make. We are a group of entrepreneurial-minded people whose collective expertise gives us a competitive advantage. World Class Manufacturing is a journey, and it’s our talented people who lead us on this journey.
Job Introduction
In this challenging and interesting position, you are the expert for all topics related to databases. You will be part of an international team that ensures the smooth and efficient operation of various database systems, including Microsoft SQL Server, Azure SQL, Oracle, DB2, MariaDB, and PostgreSQL. Your responsibilities include providing expert support for database-related issues, troubleshooting problems promptly, and collaborating with users and business stakeholders to achieve high customer satisfaction. Your expertise in cloud database services and general IT infrastructure will be crucial in supporting the development of the future data environment at Magna Powertrain.
Major Responsibilities
Ensure the smooth and efficient operation of all database systems, including but not limited to Microsoft SQL Server, Azure SQL, Oracle, DB2, MariaDB, and PostgreSQL.
Provide expert support for database-related issues; troubleshoot and resolve problems quickly as they arise to ensure minimal disruption.
Deliver professional assistance for database-related requests, working collaboratively with users and business stakeholders to achieve high customer satisfaction.
Manage the installation, implementation, configuration, administration and decommissioning of database systems.
Plan and execute database upgrades, updates and migrations, and implement changes, new patches and versions when required.
Proactively monitor database systems, database activities and overall database performance to identify issues and implement solutions to optimize performance.
Develop and implement backup and recovery strategies, and execute backups and restores to ensure data integrity and availability across all database systems.
Perform database tuning and optimization, including indexing, query optimization, and storage management.
Implement and maintain database security measures, including user access controls, encryption, and regular security audits to protect sensitive data from unauthorized access and breaches.
Create and maintain proper documentation for all database systems and processes.
Ensure constant evaluation, analysis and modernization of the database systems.
Knowledge and Education
Bachelor’s degree in Computer Science / Information Technology, or equivalent (Master’s preferred).
Work Experience
Minimum 8-10 years of proven experience as a database administrator in a similar position.
Excellent verbal and written communication skills in English. German language skills are optional, but an advantage.
Skills and Competencies
We are looking for a qualified person with:
In-depth expertise in database concepts, theory and best practices, including but not limited to high availability/clustering, replication, indexing, backup and recovery, performance tuning, database security, data integrity, data modeling and query optimization.
Expert knowledge of Microsoft SQL Server and its components, including but not limited to Failover Clustering, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), and SQL Server Analysis Services (SSAS).
Excellent knowledge of various database management systems, including but not limited to Oracle, IBM DB2, MariaDB and PostgreSQL. Familiarity with further database management systems (e.g. MySQL, MongoDB, Redis, etc.) is an advantage.
Extensive expertise in Microsoft Azure database services (Azure SQL Databases, Azure SQL Managed Instances, SQL Server on Azure VMs). Proficiency with other major cloud platforms such as AWS or Google Cloud, as well as experience with their cloud database services (e.g. Amazon RDS, Google Cloud SQL), are an advantage.
Comprehensive understanding of cloud technologies, including but not limited to cloud architecture, cloud service models and cloud security best practices.
Good general knowledge of IT infrastructure, networking, firewalls and storage systems.
High proficiency in T-SQL and other query languages. Knowledge of other scripting languages (e.g. Python, PowerShell, Visual Basic, etc.) is an advantage.
Experience with Databricks and similar data engineering tools for big data processing, analytics, and machine learning is an advantage.
A working knowledge of Microsoft Power Platform tools, including PowerApps, Power Automate, and Power BI, is an advantage.
Excellent analytical and problem-solving skills and strong attention to detail.
Ability to work effectively in an intercultural team, strong organizational skills, and high self-motivation.
Work Environment
Regular overnight travel 10-25% of the time.
For dedicated and motivated employees, we offer an interesting and diversified job within a dynamic global team, together with individual and functional development in the professional environment of a globally acting business. Fair treatment and a sense of responsibility towards employees are the principles of the Magna culture. We strive to offer an inspiring and motivating work environment.
Additional Information
We offer attractive benefits (e.g., employee profit participation program) and a salary which is in line with market conditions depending on your skills and experience.
Awareness, Unity, Empowerment
At Magna, we believe that a diverse workforce is critical to our success. That’s why we are proud to be an equal opportunity employer. We hire on the basis of experience and qualifications, and in consideration of job requirements, regardless of, in particular, color, ancestry, religion, gender, origin, sexual orientation, age, citizenship, marital status, disability or gender identity. Magna takes the privacy of your personal information seriously. We discourage you from sending applications via email, to comply with GDPR requirements and your local data privacy law.
Worker Type
Regular / Permanent
Group
Magna Powertrain
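To illustrate the proactive monitoring responsibility above in a small way, here is a Python heartbeat sketch against PostgreSQL instances (one of the engines listed); the inventory and DSNs are invented, and each engine would in practice need its own driver (e.g., pyodbc for SQL Server):

import psycopg2

# Invented inventory; real checks would cover every engine in the landscape.
DATABASES = {
    "erp-postgres": "host=erp-db dbname=erp user=monitor connect_timeout=5",
    "mes-postgres": "host=mes-db dbname=mes user=monitor connect_timeout=5",
}

def is_healthy(dsn: str) -> bool:
    """Connectivity heartbeat: can we connect and run a trivial query?"""
    try:
        with psycopg2.connect(dsn) as conn:
            with conn.cursor() as cur:
                cur.execute("SELECT 1")
                return cur.fetchone() == (1,)
    except psycopg2.Error:
        return False

for name, dsn in DATABASES.items():
    print(name, "OK" if is_healthy(dsn) else "DOWN")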

Posted 2 weeks ago

Apply

4.0 - 6.0 years

3 - 9 Lacs

Hyderābād

On-site


India - Hyderabad
JOB ID: R-216342
LOCATION: India - Hyderabad
WORK LOCATION TYPE: On Site
DATE POSTED: May. 30, 2025
CATEGORY: Information Systems
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
Senior Data Engineer
What you will do
Let’s do this. Let’s change the world. In this vital role we are looking for a highly motivated, expert Senior Data Engineer who can own the design and development of complex data pipelines, solutions and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role calls for deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.
Roles & Responsibilities:
Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric.
Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture.
Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency.
Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance.
Ensure data security, compliance, and role-based access control (RBAC) across data environments.
Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets.
Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring.
Implement data virtualization techniques to provide seamless access to data across multiple storage systems.
Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of the Enterprise Data Fabric architecture.
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
Master’s degree and 4 to 6 years of Computer Science, IT or related field experience, OR Bachelor’s degree and 6 to 8 years of Computer Science, IT or related field experience
AWS Certified Data Engineer preferred
Databricks certification preferred
Scaled Agile SAFe certification preferred
Preferred Qualifications:
Must-Have Skills:
Hands-on experience in data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
Proficiency in workflow orchestration and performance tuning of big data processing.
Strong understanding of AWS services.
Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
Ability to quickly learn, adapt and apply new technologies.
Strong problem-solving and analytical skills.
Excellent communication and collaboration skills.
Experience with Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.
Good-to-Have Skills:
Deep expertise in the Biotech & Pharma industries.
Experience in writing APIs to make data available to consumers.
Experience with SQL/NoSQL databases and vector databases for large language models.
Experience with data modeling and performance tuning for both OLAP and OLTP databases.
Experience with software engineering best practices, including version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Ability to learn quickly, be organized and detail oriented.
Strong presentation and public speaking skills.
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team: careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
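As a small, self-contained example of the PySpark batch-pipeline work listed above (the paths, columns, and aggregation are invented for illustration, not taken from the posting):

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Invented raw-zone path and schema; a governed platform would also track lineage.
orders = spark.read.parquet("s3://raw-zone/orders/")

daily_revenue = (
    orders
    .filter(F.col("status") == "SHIPPED")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Partitioning by date keeps downstream scans cheap for large data sets.
(daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://curated-zone/daily_revenue/"))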

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderābād

Remote


GCP Cloud Architect
We’re looking for a Senior GCP Cloud Architect to design and manage scalable cloud infrastructure with a strong focus on GKE and related services. This role involves leading architecture, optimizing cloud-native applications, and collaborating closely with DevOps teams to maintain a robust and automated environment.
Experience Level: Senior / Architect
Location: Hyderabad / Mumbai – may consider remote with occasional travel arrangements
Duration: Long-term assignment
Key Responsibilities:
Design and architect scalable GCP environments, especially using Google Kubernetes Engine (GKE).
Manage core GCP services including Cloud SQL (PostgreSQL), GCS, Artifact Registry, and Source Repositories.
Ensure performance, availability, and upgrade management of GKE clusters.
Optimize PostgreSQL performance through tuning, indexing, and troubleshooting.
Develop and maintain CI/CD pipelines using GitHub Actions.
Implement Infrastructure as Code using Terraform, Bash, and YAML.
Collaborate with stakeholders and lead infrastructure-related decisions.
Enable seamless deployments and hot configuration updates for containerized applications.
Required Skills & Experience:
5+ years in cloud architecture roles with hands-on GCP expertise.
Deep experience with GKE, Cloud SQL (PostgreSQL), and container orchestration.
Strong scripting and automation skills (Terraform, Bash, YAML).
Proven DevOps experience with CI/CD (GitHub Actions preferred).
Excellent communication and stakeholder engagement skills.
Nice to Have:
Experience with dynamic configuration changes without pod restarts.
Understanding of GCP security best practices and IAM.
Prior work on data archival strategies and package management workflows.
Job Type: Contract Full Time
Job Location: Hyderabad
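As a minor illustration of the GKE upgrade-management angle, here is a Python sketch that shells out to the gcloud CLI to report cluster versions; it assumes gcloud is installed and authenticated, the project ID is a placeholder, and the JSON field names reflect common GKE output rather than anything stated in the posting:

import json
import subprocess

# Placeholder project ID; requires an authenticated gcloud CLI.
result = subprocess.run(
    ["gcloud", "container", "clusters", "list",
     "--project", "my-project", "--format", "json"],
    check=True, capture_output=True, text=True,
)

for cluster in json.loads(result.stdout):
    print(cluster["name"], cluster["currentMasterVersion"], cluster["status"])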

Posted 2 weeks ago

Apply

0 years

3 - 7 Lacs

Hyderābād

On-site


Location: Hyderabad, Telangana, India
Category: News & Editorial Careers
Job Id: JREQ190761
Job Type: Full time, Hybrid
Job Description Summary:
Analyse, create, and deliver content that enhances the research value of Thomson Reuters legal information to customers, within parameters established by management and/or editors. Act as a resource for junior staff in handling both substantive content and production issues related to responsibilities.
Commentary Law:
As a member of the Global Commentary editorial team, you will work closely with other teams and individuals to collaborate on your work. Relationship management, meeting deadlines, and strong editorial skills are necessary to maintain our indexes as part of our world-class commentary collection.
About the Role:
In this role, you will provide legal interpretation to create or update existing indexes for Thomson Reuters Legal in all media. Members of this team are also tasked with ensuring that published indexes contain and deliver appropriate legal concepts and terms to create a quality finding aid for our customers. There are several key responsibilities of the Attorney Editor role:
Create back-of-the-book indexes: Our Attorney Editors generate back-of-the-book indexes by creating original content or by updating an existing index. When applicable, they incorporate their index into an existing general index. They follow the required style guidelines and deliver within extremely strict and tight deadlines.
Editorial Responsibility: Members of this team adhere to editorial publishing schedules and quality standards. They also verify their own work for accuracy and completeness and ensure that the information is properly presented and organized. In this role, attention to detail, knowledge of legal concepts and “terms of art” for how legal concepts fit together, and the ability to synthesize complex material are required.
Individual and Leadership Development: As a part of a larger team, you will attend and participate in meetings and take part in the employee evaluation process (both wider feedback and self-evaluation). To be most effective at your job, you will develop knowledge of Thomson Reuters Legal publications and products related to your responsibilities, and those of competitors. Members of our team prioritize their own work and understand the roles and responsibilities of each collaborator/team member and how each role impacts production. You will also be required to demonstrate competence on company-specific systems necessary to perform your job functions.
About you:
Education: Legal degree from an accredited law school.
Self-starter with an aptitude for legal concepts and “terms of art” for how legal concepts fit together
Analytical thinker who uses logic and collaborates to solve difficult problems
Decisive, with a focus on making quality decisions quickly
Driven by deadlines and can deliver results
Ability to interpret, analyze, organize, and communicate complex legal material
Comfortable with personal computers and familiar with word processing and online applications
Can prioritize tasks and projects and pursue them with energy and drive
Great teammate who will work successfully in a shared environment
Strong communicator who can speak and write clearly and effectively with all contacts, both in and outside Thomson Reuters
What’s in it For You?
Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles, while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news.
We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Vellore, Tamil Nadu, India

Remote


Experience: 5.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(Note: This is a requirement for one of Uplers' clients - Forbes Advisor)
What do you need for this opportunity?
Must-have skills required: Python, PostgreSQL, Snowflake, AWS RDS, BigQuery, OOPs, monitoring tools, Prometheus, ETL tools, data warehouse, Pandas, PySpark, AWS Lambda
Forbes Advisor is Looking for:
Job Description: Data Research - Database Engineer
Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.
Position Overview
At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.
The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs.
A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role.
Responsibilities:
Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
Work with databases of varying scales, including small-scale databases and databases involving big data processing.
Work on data security and compliance by implementing access controls, encryption, and compliance standards.
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery.
Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency.
Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
Monitor database health and identify and resolve issues.
Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms.
Implement data security measures to protect sensitive information and comply with relevant regulations.
Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.
Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices.
Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis.
Use Python for tasks such as data manipulation, automation, and scripting.
Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines.
Assume accountability for achieving development milestones.
Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
Collaborate with and assist fellow members of the Data Research Engineering Team as required.
Perform tasks with precision and build reliable systems.
Leverage online resources effectively, like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations.
Skills And Experience
Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals.
Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
Knowledge of SQL and understanding of database design principles, normalization, and indexing.
Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
Eagerness to develop import workflows and scripts to automate data import processes.
Knowledge of data security best practices, including access controls, encryption, and compliance standards.
Strong problem-solving and analytical skills with attention to detail.
Creative and critical thinking.
Strong willingness to learn and expand knowledge in data engineering.
Familiarity with Agile development methodologies is a plus.
Experience with version control systems, such as Git, for collaborative development.
Ability to thrive in a fast-paced environment with rapidly changing priorities.
Ability to work collaboratively in a team environment.
Good and effective communication skills.
Comfortable with autonomy and ability to work independently.
Perks:
Day off on the 3rd Friday of every month (one long weekend each month)
Monthly Wellness Reimbursement Program to promote health and well-being
Monthly Office Commutation Reimbursement Program
Paid paternity and maternity leaves
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances to get shortlisted and meet the client for the interview!
About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
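To picture the import-workflow automation this role keeps returning to, here is a minimal sketch using Pandas and SQLAlchemy (both named in the skills list); the file, connection string, table, and columns are placeholders invented for the example:

import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string and spreadsheet export.
engine = create_engine("postgresql+psycopg2://etl@db-host/research")

df = pd.read_csv("exports/products.csv")

# Light validation before load: required columns present, no duplicate keys.
required = {"sku", "name", "price"}
missing = required - set(df.columns)
if missing:
    raise ValueError(f"missing columns: {missing}")
df = df.drop_duplicates(subset="sku")

df.to_sql("products", engine, if_exists="append", index=False)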

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Madurai, Tamil Nadu, India

Remote

Linkedin logo

Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, Postgre SQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, Pyspark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. 
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. 
Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

Remote

Linkedin logo

Experience: 5.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(*Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

What do you need for this opportunity?

Must-have skills required: Python, PostgreSQL, Snowflake, AWS RDS, BigQuery, OOP, monitoring tools (Prometheus), ETL tools, data warehouses, Pandas, PySpark, AWS Lambda

Forbes Advisor is looking for:

Job Description: Data Research - Database Engineer

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

Position Overview

At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.

The Data Research Engineering Team is a brand-new team whose purpose is managing data from acquisition to presentation, collaborating with other teams while also operating independently. Its responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. The team plays a crucial role in enabling data-driven decision-making and meeting the organization's data needs.

A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous-learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role.

Responsibilities:

  • Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
  • Work with databases of varying scales, including small-scale databases and databases involving big data processing.
  • Work on data security and compliance by implementing access controls, encryption, and compliance standards.
  • Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
  • Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery.
  • Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency.
  • Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
  • Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
  • Monitor database health and identify and resolve issues.
  • Collaborate with the team's full-stack web developer to support the implementation of efficient data access and retrieval mechanisms.
  • Implement data security measures to protect sensitive information and comply with relevant regulations.
  • Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.
  • Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices.
  • Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
  • Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis.
  • Use Python for tasks such as data manipulation, automation, and scripting.
  • Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines.
  • Assume accountability for achieving development milestones.
  • Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
  • Collaborate with and assist fellow members of the Data Research Engineering Team as required.
  • Perform tasks with precision and build reliable systems.
  • Leverage online resources such as StackOverflow, ChatGPT, and Bard effectively, while considering their capabilities and limitations.

Skills and Experience:

  • Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
  • Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
  • Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
  • Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
  • Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
  • Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
  • Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
  • Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
  • Knowledge of SQL and understanding of database design principles, normalization, and indexing.
  • Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
  • Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
  • Eagerness to develop import workflows and scripts to automate data import processes.
  • Knowledge of data security best practices, including access controls, encryption, and compliance standards.
  • Strong problem-solving and analytical skills with attention to detail.
  • Creative and critical thinking.
  • Strong willingness to learn and expand knowledge in data engineering.
  • Familiarity with Agile development methodologies is a plus.
  • Experience with version control systems, such as Git, for collaborative development.
  • Ability to thrive in a fast-paced environment with rapidly changing priorities.
  • Ability to work collaboratively in a team environment.
  • Good and effective communication skills.
  • Comfortable with autonomy and able to work independently.

Perks:

  • Day off on the 3rd Friday of every month (one long weekend each month)
  • Monthly Wellness Reimbursement Program to promote health and well-being
  • Monthly Office Commutation Reimbursement Program
  • Paid paternity and maternity leaves

How to apply for this opportunity?

Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one. Depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
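
Since this listing centers on automated import workflows built with Pandas and SQLAlchemy, here is a minimal sketch of the kind of spreadsheet-to-PostgreSQL load it describes. This is an illustration only: the file path, table name, and connection string are hypothetical, not details from the posting.

```python
# Minimal import-workflow sketch: CSV -> PostgreSQL (illustrative only).
# Assumes pandas and SQLAlchemy are installed; the DSN, file path, and
# table name are placeholders, not values taken from the job posting.
import pandas as pd
from sqlalchemy import create_engine

def load_csv_to_postgres(csv_path: str, table: str, dsn: str) -> int:
    """Load a CSV into a PostgreSQL table inside a single transaction."""
    df = pd.read_csv(csv_path)
    # Basic consistency checks before loading: drop fully empty rows and
    # normalize column names so they are valid SQL identifiers.
    df = df.dropna(how="all")
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    engine = create_engine(dsn)
    with engine.begin() as conn:  # transactional: all-or-nothing load
        df.to_sql(table, conn, if_exists="replace", index=False)
    return len(df)

if __name__ == "__main__":
    rows = load_csv_to_postgres(
        "products.csv",                           # hypothetical source file
        "products",                               # hypothetical target table
        "postgresql://user:pass@localhost/mydb",  # hypothetical DSN
    )
    print(f"Loaded {rows} rows")
```

A production pipeline would add the validation rules and constraint checks the responsibilities above call out, but a transactional write like this is the usual starting point.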

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Faridabad, Haryana, India

Remote


Experience: 5.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(*Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

What do you need for this opportunity?

Must-have skills required: Python, PostgreSQL, Snowflake, AWS RDS, BigQuery, OOP, monitoring tools (Prometheus), ETL tools, data warehouses, Pandas, PySpark, AWS Lambda

Forbes Advisor is looking for:

Job Description: Data Research - Database Engineer

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

Position Overview

At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.

The Data Research Engineering Team is a brand-new team whose purpose is managing data from acquisition to presentation, collaborating with other teams while also operating independently. Its responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. The team plays a crucial role in enabling data-driven decision-making and meeting the organization's data needs.

A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous-learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role.

Responsibilities:

  • Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
  • Work with databases of varying scales, including small-scale databases and databases involving big data processing.
  • Work on data security and compliance by implementing access controls, encryption, and compliance standards.
  • Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
  • Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery.
  • Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency.
  • Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
  • Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
  • Monitor database health and identify and resolve issues.
  • Collaborate with the team's full-stack web developer to support the implementation of efficient data access and retrieval mechanisms.
  • Implement data security measures to protect sensitive information and comply with relevant regulations.
  • Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.
  • Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices.
  • Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
  • Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis.
  • Use Python for tasks such as data manipulation, automation, and scripting.
  • Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines.
  • Assume accountability for achieving development milestones.
  • Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
  • Collaborate with and assist fellow members of the Data Research Engineering Team as required.
  • Perform tasks with precision and build reliable systems.
  • Leverage online resources such as StackOverflow, ChatGPT, and Bard effectively, while considering their capabilities and limitations.

Skills and Experience:

  • Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
  • Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
  • Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
  • Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
  • Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
  • Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
  • Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
  • Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
  • Knowledge of SQL and understanding of database design principles, normalization, and indexing.
  • Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
  • Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
  • Eagerness to develop import workflows and scripts to automate data import processes.
  • Knowledge of data security best practices, including access controls, encryption, and compliance standards.
  • Strong problem-solving and analytical skills with attention to detail.
  • Creative and critical thinking.
  • Strong willingness to learn and expand knowledge in data engineering.
  • Familiarity with Agile development methodologies is a plus.
  • Experience with version control systems, such as Git, for collaborative development.
  • Ability to thrive in a fast-paced environment with rapidly changing priorities.
  • Ability to work collaboratively in a team environment.
  • Good and effective communication skills.
  • Comfortable with autonomy and able to work independently.

Perks:

  • Day off on the 3rd Friday of every month (one long weekend each month)
  • Monthly Wellness Reimbursement Program to promote health and well-being
  • Monthly Office Commutation Reimbursement Program
  • Paid paternity and maternity leaves

How to apply for this opportunity?

Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one. Depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
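
This listing repeats the emphasis on analyzing query execution plans and implementing indexing strategies, so here is a complementary minimal sketch: inspecting a PostgreSQL query plan before and after adding an index, via SQLAlchemy. The table, column, and connection string are hypothetical assumptions, not details from the posting.

```python
# Sketch: compare a PostgreSQL query plan before and after adding an index.
# The table, column, and DSN below are illustrative assumptions.
from sqlalchemy import create_engine, text

DSN = "postgresql://user:pass@localhost/mydb"          # hypothetical DSN
QUERY = "SELECT * FROM orders WHERE customer_id = 42"  # hypothetical table

def explain(conn, sql: str) -> None:
    # EXPLAIN ANALYZE executes the query and reports the actual plan and timing.
    for (line,) in conn.execute(text(f"EXPLAIN ANALYZE {sql}")):
        print(line)

engine = create_engine(DSN)
with engine.begin() as conn:
    explain(conn, QUERY)  # without an index: typically a sequential scan
    # A B-tree index on the filtered column lets the planner switch to an
    # index scan for selective lookups.
    conn.execute(text(
        "CREATE INDEX IF NOT EXISTS idx_orders_customer_id "
        "ON orders (customer_id)"
    ))
    conn.execute(text("ANALYZE orders"))  # refresh planner statistics
    explain(conn, QUERY)  # should now report an Index Scan
```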

Posted 2 weeks ago

Apply

Exploring Indexing Jobs in India

The indexing job market in India has been growing steadily over the past few years, with an increasing demand for professionals skilled in data organization and management. Indexing roles are crucial in various industries such as publishing, research, and information technology. In this article, we will explore the job opportunities, salary ranges, career paths, related skills, and interview questions for indexing roles in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

These cities are known for their vibrant job markets and have a high demand for indexing professionals.

Average Salary Range

The average salary range for indexing professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 3-5 lakhs per annum, while experienced professionals can earn upwards of INR 10 lakhs per annum.

Career Path

A typical career progression in indexing roles may include:

  • Indexing Associate
  • Indexing Specialist
  • Senior Indexing Analyst
  • Indexing Manager

With experience and additional certifications, professionals can move up the ladder to managerial roles within the indexing field.

Related Skills

In addition to indexing skills, professionals in this field are often expected to bring:

  • Data management knowledge
  • Familiarity with information retrieval systems
  • Database management skills
  • Advanced Excel skills
  • Attention to detail

Interview Questions

  • What is indexing and how is it different from sorting? (basic)
  • Can you explain the importance of indexing in database management? (medium)
  • How would you handle a large dataset for indexing purposes? (medium)
  • What are the different types of indexing techniques? (advanced)
  • Can you discuss the challenges faced in maintaining an index? (advanced)
  • How would you optimize indexing performance in a database system? (advanced)
  • Explain the concept of clustered and non-clustered indexing. (medium)
  • What is the role of primary keys in indexing? (basic)
  • How does indexing impact query performance? (advanced)
  • Can you explain the concept of index fragmentation? (medium)
  • How do you troubleshoot indexing-related issues in a database? (medium)
  • What are the best practices for creating an efficient index? (advanced)
  • How do you decide which columns to include in an index? (medium)
  • Describe the process of index reorganization in a database. (medium)
  • How do you monitor and maintain index health in a database system? (advanced)
  • Can you discuss the differences between full-text indexing and regular indexing? (medium)
  • How would you handle indexing for a multilingual database? (advanced)
  • What are the limitations of indexing in a database system? (medium)
  • How do you approach index design for a new database project? (medium)
  • Can you explain the impact of indexing on storage requirements? (basic)
  • How do you ensure data consistency when creating and updating indexes? (medium)
  • Describe a challenging indexing project you have worked on and how you resolved it. (advanced)
  • How do you stay updated on the latest trends and technologies in indexing? (medium)
  • Can you discuss the role of indexing in big data analytics? (advanced)
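
For the basic questions above (what indexing is and how it affects query performance), a small worked example often lands better than a definition. The sketch below contrasts a linear scan with a dictionary-based lookup in Python; real databases use B-trees rather than hash maps, and the data here is synthetic, but the storage-for-speed trade-off is the same idea.

```python
# Sketch: why indexes speed up lookups -- linear scan vs. in-memory index.
# Synthetic data; illustrates the trade-off behind database indexing.
import random
import time

rows = [{"id": i, "city": random.choice(["Pune", "Delhi", "Mumbai"])}
        for i in range(1_000_000)]

# Without an index, every query scans all rows: O(n) per lookup.
def scan(city):
    return [r for r in rows if r["city"] == city]

# Building an index costs time and memory up front...
index = {}
for r in rows:
    index.setdefault(r["city"], []).append(r)

# ...but afterwards a lookup on the indexed column is near-instant.
def indexed(city):
    return index.get(city, [])

t0 = time.perf_counter(); scan("Pune")
t1 = time.perf_counter(); indexed("Pune")
t2 = time.perf_counter()
print(f"linear scan: {t1 - t0:.4f}s")
print(f"indexed:     {t2 - t1:.6f}s")

# The flip side interviewers probe for: an index must be updated on every
# insert, update, and delete, which is why over-indexing slows writes.
```

This also sets up the follow-up questions on maintenance cost and storage impact: the index structure above consumes extra memory and must be kept in sync with the rows, which is the same trade-off behind index fragmentation and write overhead in a real database.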

Closing Remark

As you prepare for indexing roles in India, remember to showcase your expertise in data organization and management. Stay updated on industry trends and technologies to stand out in the competitive job market. With the right skills and preparation, you can confidently apply for indexing roles and advance your career in this field. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies