
2278 Encryption Jobs - Page 49

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


We are seeking an experienced GoAnywhere Administrator with over 5 years of expertise in managing and optimizing GoAnywhere Managed File Transfer (MFT) solutions. The ideal candidate will be responsible for ensuring the secure, reliable, and efficient transfer of data across our systems and with our partners. This role requires a deep understanding of MFT processes, security protocols, and compliance requirements.

Key Responsibilities:
- System Administration: Oversee the installation, configuration, and maintenance of GoAnywhere MFT systems to ensure operational efficiency and security.
- Workflow Design and Automation: Develop and manage automated workflows for file transfers, leveraging GoAnywhere's features to streamline operations.
- Security Management: Implement and maintain security policies, including encryption, access controls, and secure transfer protocols (SFTP, FTPS, HTTPS).
- User and Role Management: Create and manage user accounts and roles, ensuring appropriate access levels and permissions.
- Monitoring and Troubleshooting: Monitor file transfer activities, resolve issues, and optimize performance. Provide support for incident and problem management.
- Compliance and Reporting: Ensure that all file transfer activities comply with relevant regulations and standards. Generate detailed reports for audits and compliance checks.
- Collaboration: Work closely with IT teams, business units, and external partners to understand requirements and deliver effective file transfer solutions.
- Documentation: Maintain comprehensive documentation of systems, processes, and procedures for internal reference and training purposes.

Mandatory Requirements:
- GoAnywhere Administrator with over 5 years of expertise in managing and optimizing GoAnywhere Managed File Transfer (MFT) solutions.
- Notice Period: Immediate to 30 days
- Relevant Experience Required: 5-10 years
- Location: Bangalore, Hyderabad, Mumbai, Kolkata, Gurgaon, Chennai, Noida
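GoAnywhere builds its transfer workflows in a graphical project designer, so the role itself requires no code. Purely as an illustration of the kind of scripted secure transfer an MFT workflow automates, here is a minimal Python SFTP sketch using paramiko; the host, account, key path, and file paths are hypothetical placeholders.

```python
import paramiko

# Minimal SFTP upload sketch (placeholder host and credentials; GoAnywhere
# itself would normally own this workflow through its project designer).
HOST, PORT = "mft.example.com", 22                         # hypothetical endpoint
USER, KEY_FILE = "transfer_svc", "/etc/keys/transfer_svc"  # hypothetical account

client = paramiko.SSHClient()
client.load_system_host_keys()                 # trust only known hosts
client.set_missing_host_key_policy(paramiko.RejectPolicy())
client.connect(HOST, port=PORT, username=USER, key_filename=KEY_FILE)

try:
    sftp = client.open_sftp()
    sftp.put("/data/outbound/invoices.csv", "/inbound/invoices.csv")
    print("transfer complete")
finally:
    client.close()
```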

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction

IBM Security Verify, placed in the Gartner Leaders Quadrant, is a cloud-based Identity and Access Management (IAM) solution that helps organizations manage user identities and access to applications and resources. It provides features like multi-factor authentication, single sign-on, risk-based authentication, and adaptive access, as well as user lifecycle journeys with associated governance, aiming to protect customer, workforce, and privileged identities. The solution also offers identity analytics to provide insights into user behavior and potential risks.

Your Role and Responsibilities
- Contribute to the development and maintenance of reusable React-based UI features and components integrated with backend APIs in a cloud-native environment.
- Support end-to-end feature development, contribute ideas in team discussions, and grow your technical expertise through hands-on experience and mentorship from senior team members.
- Implement and maintain test automation to ensure product reliability (see the sketch after this listing).
- Collaborate with tech leads and senior developers to understand requirements and deliver clean, maintainable code.
- Participate in code reviews, testing, and debugging to ensure high-quality software delivery.
- Support the team in setting up and troubleshooting development and test environments.
- Follow Agile methodologies and contribute to sprint planning, daily stand-ups, and retrospectives.
- Learn and adopt best practices in coding, testing, and software design through mentorship and hands-on experience.
- Continuously improve skills by staying current with industry trends, tools, and technologies.

Preferred Education: Master's Degree

Required Technical and Professional Expertise
- 3+ years of experience developing web-based applications using React, TypeScript, JavaScript, HTML5, and CSS3.
- Familiarity with RESTful APIs.
- Familiarity with CI/CD tools like Git, GitHub, or Jenkins, and experience with source-control workflows.
- Exposure to cloud platforms such as AWS, Azure, or Red Hat OpenShift (OCP), especially deploying basic applications or using managed services.
- Understanding of Docker and basic containerization concepts.
- Experience writing unit and integration tests using tools such as JUnit, Selenium, or Cucumber.
- Familiarity with debugging tools and browser-based dev tools for frontend development.
- Exposure to Agile software development processes like Scrum or Kanban.
- Good communication skills, strong problem-solving skills, and willingness to collaborate with team members and learn from senior developers.

Preferred Technical and Professional Experience
- Exposure to backend architectural concepts such as microservices or MVC patterns.
- Familiarity with accessibility standards (e.g., WCAG, Section 508) is a plus.
- Experience or coursework in security concepts, including encryption, secure coding, or authentication frameworks.
- Basic understanding of security best practices, including privacy-by-design principles.
- Exposure to scripting languages such as Shell or platforms like Node.js is desirable.
- Experience working with UI component libraries or design systems (e.g., Carbon, Material UI) is a plus.
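Given the posting's emphasis on REST-backed UI features and test automation, here is a minimal sketch of an API integration test in Python's pytest style. The base URL, route, token handling, and response shape are hypothetical placeholders, not the actual IBM Security Verify API.

```python
import requests

# Illustrative integration-test sketch for a REST endpoint (pytest style).
# Everything below is a placeholder: in practice the base URL and token
# would come from fixtures or a secret store, not literals.
BASE_URL = "https://api.example.com"
TOKEN = "test-token"


def test_list_users_returns_ok():
    resp = requests.get(
        f"{BASE_URL}/v1/users",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    assert resp.status_code == 200
    body = resp.json()
    assert isinstance(body.get("users"), list)  # hypothetical response shape
```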

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Delhi, India

On-site


We are looking for an immediate joiner for the Delhi location (WFO).

Job Description: PostgreSQL DBA

Job Summary: We are looking for an experienced PostgreSQL Database Administrator (DBA) to lead our database administration and data migration activities, specifically for CRM applications deployed on cloud instances. The ideal candidate will have a strong background in PostgreSQL administration, data migration, and cloud infrastructure, and will play a key role in ensuring database performance, security, and availability during the cloud migration process.

Key Responsibilities:

Database Administration:
- Manage and maintain PostgreSQL databases across cloud instances, ensuring high availability, performance, and security.
- Perform routine database administration tasks such as monitoring, tuning, and backups (see the monitoring sketch after this listing).
- Implement and maintain database clustering, replication, and partitioning strategies to support scalability and performance.

Data Migration:
- Lead the data migration process from on-premise CRM applications to cloud-based PostgreSQL instances.
- Develop and execute data migration plans, including data mapping, data transformation, and validation strategies.
- Ensure data integrity and minimal downtime during migration activities.

Performance Tuning and Optimization:
- Monitor database performance and optimize queries, indexing, and schema design to improve performance.
- Proactively identify performance bottlenecks and resolve them to ensure smooth database operations.
- Implement database monitoring tools and alerts to quickly detect and address issues.

Cloud Instance Management:
- Work with cloud platforms (such as GCP, AWS, or Azure) to manage PostgreSQL instances in cloud environments.
- Ensure the cloud database environment adheres to security and compliance standards.
- Implement best practices for cloud-based database backup, disaster recovery, and high availability.

Security & Compliance:
- Implement database security measures, including access controls, encryption, and auditing.
- Ensure compliance with industry standards and organizational policies related to data security and privacy.

Collaboration & Documentation:
- Collaborate with application teams, DevOps, and cloud engineers to ensure seamless integration between databases and applications.
- Document all database configurations, procedures, and migration plans.
- Provide technical support and guidance to internal teams regarding database best practices.

Skills and Qualifications:

Technical Skills:
- 3-5 years of experience working as a PostgreSQL DBA, with strong experience managing databases in cloud environments.
- Experience with cloud platforms such as Google Cloud Platform (GCP), AWS, or Azure for managing PostgreSQL databases.
- Proficiency in data migration from on-premise databases to cloud instances.
- Strong knowledge of database performance tuning, query optimization, and indexing strategies.
- Experience with high availability and disaster recovery configurations, such as replication and clustering.
- Familiarity with security best practices for databases, including encryption, role-based access control (RBAC), and auditing.

Soft Skills:
- Strong problem-solving and analytical skills to diagnose and resolve database-related issues.
- Excellent communication skills for collaborating with cross-functional teams.
- Ability to work independently and manage multiple tasks in a fast-paced environment.

Certifications (preferred but not mandatory):
- Certified PostgreSQL DBA or similar certifications.
- Cloud certifications related to database management, such as AWS Certified Database – Specialty or Google Cloud Professional Database Engineer, are a plus.

Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Relevant certifications in database management or cloud platforms are a plus.

Additional Information: Opportunity to lead large-scale data migration projects for CRM applications on cloud infrastructure. Work with cutting-edge cloud technologies and collaborate with cross-functional teams.
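As a small illustration of the routine monitoring this role describes, the sketch below uses psycopg2 to flag queries that have been running for more than five minutes via pg_stat_activity. The DSN is a hypothetical placeholder.

```python
import psycopg2

# Sketch: flag long-running queries via pg_stat_activity, one routine
# monitoring task a PostgreSQL DBA would automate. DSN is a placeholder.
DSN = "host=db.example.com dbname=crm user=dba_ro password=secret"  # hypothetical

with psycopg2.connect(DSN) as conn:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT pid, now() - query_start AS runtime, state, query
            FROM pg_stat_activity
            WHERE state <> 'idle'
              AND now() - query_start > interval '5 minutes'
            ORDER BY runtime DESC
            """
        )
        for pid, runtime, state, query in cur.fetchall():
            print(f"pid={pid} runtime={runtime} state={state}\n  {query[:120]}")
```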

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote


Description

Want to be a part of a start-up environment within Amazon to design and build a new fintech payments product right from scratch? Want to enable hundreds of millions of Amazon customers to shop on Amazon using next-generation credit products? Want to be a part of a team that will enable you to deliver products handling highly sensitive customer data, at high traffic and minimum latency, while handling cross-region calls if required? Want to be a part of a team that will enable you to learn the latest technologies and skills relevant in the software development industry?

The Amazon India Payments team is looking for software developers who are passionate about designing and building the next-generation payments product from the ground up. Once built, this highly reliable and scalable product will provide a new payment gateway to hundreds of millions of Amazon India customers. The team will learn and use the latest AWS technologies, including AWS Dacia, AWS Kinesis, Lambda, SNS, SQS, server-side encryption on DynamoDB using client-managed keys, API Gateway, AWS VPC, AWS NLB, CloudTrail, Elasticsearch, etc. (a sketch of DynamoDB server-side encryption follows this listing). Additionally, the team provides opportunities to learn and work on machine learning, and to interact with and influence Amazon third-party partners such as banks (SBI, HDFC, etc.) and PAs like PayU, BillDesk, etc. The platform will be designed to support other emerging economies with similar requirements, and the role provides a huge opportunity for developers to build a strong portfolio of patents for Amazon. Developers in the team need a strong understanding of computer fundamentals and preferably experience in building large-scale distributed systems.

Key job responsibilities
- Ability to code the right solutions starting with broadly defined problems; understand basic algorithm fundamentals.
- Develop code in object-oriented languages like C++ and Java and build large-scale, robust distributed systems.

A day in the life

Why join us?
- Innovation: We are working on key end-customer-facing innovations like OTP reader and auto-submit, CVV-less transactions, OTP-less flows, best-in-industry processing latency, unified recurring-payments selection & processing, etc. The ambiguity and complexity of these areas provide endless opportunities to innovate on behalf of our customers.
- Learning: We work on a very diverse problem space, ranging from client-side innovations to fault-tolerant Tier-1 distributed systems to working with various industry partners. It involves discussion with multiple partners in the payment ecosystem, like processors, acquirers, networks (Visa/Mastercard/RuPay/AMEX/Diners), and issuers, which helps in understanding the end-to-end functioning of the payment industry. We encourage and facilitate rotation of engineers in the team across these areas for multifaceted growth.
- Growth: We have multiple large initiatives in the pipeline for H2 2023, like CVV-less, 3DS 2.0 optimization, direct connectivity, real-time monitoring, etc. We are looking for SDEs to join us and lead these ambitious projects. We have multiple Sr. SDEs and PEs working on and overseeing projects in this space to nurture and guide junior engineers. The managers in our team collaborate with each team member on planning and tracking career growth via goal setting, frequent career connects, individual mentors, and proactive goal/work summary assessments.
- Fun: Despite the hurdles created by a fully remote work environment, we find avenues to have fun with teammates while working hard and making history.

About The Team

The IN Payments - Cards & Netbanking team's vision is to provide a best-in-class, intuitive, and seamless payment and acceptance experience to all merchants. Our mission is to reduce friction from payments by minimizing steps/clicks, providing multiple connectivity pipes to reduce latency/COP, and providing reusable solutions for tokenization and authentication across marketplaces and to external merchants. Overall, the mission is to make card payments simple, secure, reliable, and most rewarding for every user in India.

Basic Qualifications
- 3+ years of non-internship professional software development experience
- 2+ years of non-internship experience in design or architecture (design patterns, reliability, and scaling) of new and existing systems
- Experience programming with at least one software programming language
- Knowledge of professional software engineering best practices for the full software development life cycle, including coding standards, software architectures, code reviews, source control management, continuous deployments, testing, and operational excellence

Preferred Qualifications
- 3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations
- Experience contributing to the architecture and design (architecture, design patterns, reliability, and scaling) of new and current systems

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2959376
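The posting mentions server-side encryption on DynamoDB with client-managed keys. As a hedged sketch of what that can look like, here is a boto3 call creating a table encrypted under a customer-managed KMS key; the table name, schema, region, and key ARN are hypothetical placeholders.

```python
import boto3

# Sketch: create a DynamoDB table with server-side encryption under a
# customer-managed KMS key. All names and the key ARN are hypothetical.
client = boto3.client("dynamodb", region_name="ap-south-1")

client.create_table(
    TableName="PaymentAuthorizations",
    AttributeDefinitions=[{"AttributeName": "txn_id", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "txn_id", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
    SSESpecification={
        "Enabled": True,
        "SSEType": "KMS",
        "KMSMasterKeyId": "arn:aws:kms:ap-south-1:111122223333:key/EXAMPLE",
    },
)
```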

Posted 2 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


What We Offer

At Magna, you can expect an engaging and dynamic environment where you can help to develop industry-leading automotive technologies. We invest in our employees, providing them with the support and resources they need to succeed. As a member of our global team, you can expect exciting, varied responsibilities as well as a wide range of development prospects, because we believe that your career path should be as unique as you are.

Group Summary

Transforming mobility. Making automotive technology that is smarter, cleaner, safer, and lighter. That's what we're passionate about at Magna Powertrain, and we do it by creating world-class powertrain systems. We are a premier supplier for the global automotive industry with full capabilities in design, development, testing, and manufacturing of complex powertrain systems. Our name stands for quality, environmental consciousness, and safety. Innovation is what drives us, and we drive innovation. Dream big and create the future of mobility at Magna Powertrain.

Company Introduction

At Magna, we create technology that disrupts the industry and solves big problems for consumers, our customers, and the world around us. We're the only mobility technology company and supplier with complete expertise across the entire vehicle. We are committed to quality and continuous improvement because our products impact millions of people every day. But we're more than what we make. We are a group of entrepreneurial-minded people whose collective expertise gives us a competitive advantage. World-class manufacturing is a journey, and it's our talented people who lead us on this journey.

Job Introduction

In this challenging and interesting position, you are the expert for all topics related to databases. You will be part of an international team that ensures the smooth and efficient operation of various database systems, including Microsoft SQL Server, Azure SQL, Oracle, DB2, MariaDB, and PostgreSQL. Your responsibilities include providing expert support for database-related issues, troubleshooting problems promptly, and collaborating with users and business stakeholders to achieve high customer satisfaction. Your expertise in cloud database services and general IT infrastructure will be crucial in supporting the development of the future data environment at Magna Powertrain.

Major Responsibilities
- Ensure the smooth and efficient operation of all database systems, including but not limited to Microsoft SQL Server, Azure SQL, Oracle, DB2, MariaDB, and PostgreSQL.
- Provide expert support for database-related issues; troubleshoot and resolve problems quickly as they arise to ensure minimal disruption.
- Deliver professional assistance for database-related requests, working collaboratively with users and business stakeholders to achieve high customer satisfaction.
- Manage the installation, implementation, configuration, administration, and decommissioning of database systems.
- Plan and execute database upgrades, updates, and migrations, and implement changes, new patches, and versions when required.
- Proactively monitor database systems, database activities, and overall database performance to identify issues and implement solutions to optimize performance (see the monitoring sketch after this listing).
- Develop and implement backup and recovery strategies; execute backups and restores to ensure data integrity and availability across all database systems.
- Perform database tuning and optimization, including indexing, query optimization, and storage management.
- Implement and maintain database security measures, including user access controls, encryption, and regular security audits, to protect sensitive data from unauthorized access and breaches.
- Create and maintain proper documentation for all database systems and processes.
- Ensure constant evaluation, analysis, and modernization of the database systems.

Knowledge and Education
- Bachelor's degree in computer science / information technology, or equivalent (Master's preferred).

Work Experience
- Minimum 8-10 years of proven experience as a database administrator in a similar position.
- Excellent verbal and written communication skills in English. German language skills are optional but an advantage.

Skills and Competencies

We are looking for a qualified person with:
- In-depth expertise in database concepts, theory, and best practices, including but not limited to high availability/clustering, replication, indexing, backup and recovery, performance tuning, database security, data integrity, data modeling, and query optimization.
- Expert knowledge of Microsoft SQL Server and its components, including but not limited to Failover Clustering, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), and SQL Server Analysis Services (SSAS).
- Excellent knowledge of various database management systems, including but not limited to Oracle, IBM DB2, MariaDB, and PostgreSQL. Familiarity with further database management systems (e.g., MySQL, MongoDB, Redis, etc.) is an advantage.
- Extensive expertise in Microsoft Azure database services (Azure SQL Databases, Azure SQL Managed Instances, SQL Server on Azure VMs). Proficiency with other major cloud platforms such as AWS or Google Cloud, as well as experience with their cloud database services (e.g., Amazon RDS, Google Cloud SQL), is an advantage.
- Comprehensive understanding of cloud technologies, including but not limited to cloud architecture, cloud service models, and cloud security best practices.
- Good general knowledge of IT infrastructure, networking, firewalls, and storage systems.
- High proficiency in T-SQL and other query languages. Knowledge of other scripting languages (e.g., Python, PowerShell, Visual Basic, etc.) is an advantage.
- Experience with Databricks and similar data engineering tools for big data processing, analytics, and machine learning is an advantage.
- Working knowledge of Microsoft Power Platform tools, including PowerApps, Power Automate, and Power BI, is an advantage.
- Excellent analytical and problem-solving skills and strong attention to detail.
- Ability to work effectively in an intercultural team, strong organizational skills, and high self-motivation.

Work Environment
- Regular overnight travel 10-25% of the time.

For dedicated and motivated employees, we offer an interesting and diversified job within a dynamic global team, together with individual and functional development in the professional environment of a globally acting business. Fair treatment and a sense of responsibility towards employees are principles of the Magna culture. We strive to offer an inspiring and motivating work environment.

Additional Information

We offer attractive benefits (e.g., an employee profit participation program) and a salary in line with market conditions, depending on your skills and experience.

Awareness, Unity, Empowerment

At Magna, we believe that a diverse workforce is critical to our success. That's why we are proud to be an equal opportunity employer. We hire on the basis of experience and qualifications, and in consideration of job requirements, regardless of, in particular, color, ancestry, religion, gender, origin, sexual orientation, age, citizenship, marital status, disability, or gender identity.

Magna takes the privacy of your personal information seriously. We discourage you from sending applications via email, to comply with GDPR requirements and your local data privacy law.

Worker Type: Regular / Permanent
Group: Magna Powertrain
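As one concrete example of the proactive monitoring listed above, the sketch below queries a SQL Server dynamic management view for the longest-running active requests, using Python and pyodbc. Connection details are hypothetical placeholders.

```python
import pyodbc

# Sketch: list active SQL Server requests by elapsed time via a DMV,
# the kind of proactive check a DBA would automate. Placeholders throughout.
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sql01.example.com;DATABASE=master;"
    "UID=dba_ro;PWD=secret;Encrypt=yes"
)

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()
    cur.execute(
        """
        SELECT r.session_id, r.status, r.wait_type, r.total_elapsed_time
        FROM sys.dm_exec_requests AS r
        WHERE r.session_id > 50      -- skip system sessions
        ORDER BY r.total_elapsed_time DESC
        """
    )
    for row in cur.fetchall():
        print(row.session_id, row.status, row.wait_type, row.total_elapsed_time)
```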

Posted 2 weeks ago

Apply

15.0 years

0 Lacs

India

On-site


Elastic, the Search AI Company, enables everyone to find the answers they need in real time, using all their data, at scale — unleashing the potential of businesses and people. The Elastic Search AI Platform, used by more than 50% of the Fortune 500, brings together the precision of search and the intelligence of AI to enable everyone to accelerate the results that matter. By taking advantage of all structured and unstructured data — securing and protecting private information more effectively — Elastic's complete, cloud-based solutions for search, security, and observability help organizations deliver on the promise of AI.

What Is The Role

As a Security Specialist Solutions Architect, you will bring deep expertise in SIEM solutions, security technologies, and consultative selling to support customers in securing their data and infrastructure. You will leverage your knowledge of vendor ecosystems, security architectures, and operational processes to help customers solve complex challenges, enhance incident response, and build modern, scalable security solutions.

Joining Elastic Means Joining a Company That:
- Empowers you to make a real impact: Your work will directly influence the success of organizations worldwide, helping them unlock new opportunities and make insightful decisions.
- Offers unparalleled growth opportunities: We cultivate a culture of learning and provide access to world-class training programs and cutting-edge technology.
- Connects you with passionate individuals: Collaborate with a diverse team of engineers, data scientists, and industry experts who are passionate about pushing the boundaries of security analytics and AI-powered solutions.
- Rewards you for your contributions: Enjoy a competitive compensation and benefits package that reflects your talent and dedication.

Learn More:
- Our Mission: To help people understand their data and align with their desired outcomes.
- Our Values: Collaboration, Openness, Customer Obsession, and Innovation.
- Our Culture: We're a diverse team of passionate individuals united by a common goal: to make the world a more informed place.

What You Will Be Doing
- Architect next-generation security solutions: Design and implement cutting-edge solutions utilizing Elastic's latest advancements in AI to help improve incident response.
- Harness the power of the Elastic Security solution to enable organizations to achieve their use-case requirements and improve overall operational efficiency.
- Deliver compelling proof-of-value (POV) projects: Showcase the value of Elastic Security's detection, AI, and incident response capabilities, demonstrating the potential to improve the analyst experience and shorten the time it takes for an analyst to make an informed decision.
- Be a thought leader and evangelize our vision: Share your expertise through presentations, blog posts, and other channels, positioning Elastic as the leader in SIEM and analytics.
- Foster collaboration and knowledge sharing: Cultivate a collaborative environment within the Solutions Architecture team and across departments, promoting knowledge sharing and driving continuous improvement.
- Stay ahead of the curve: Continuously learn and explore the ever-evolving landscapes of SIEM, analytics, and related technologies, ensuring our solutions are always at the forefront of innovation.

What You Bring
- 15+ years of experience in designing and architecting enterprise-level SIEM solutions.
- Extensive experience with SIEM platforms and vendor solutions in enterprise environments.
- Deep understanding of AI and its impact on SIEM and analytics solutions.
- Strong understanding of SOC workflows, processes, and operational challenges.
- Knowledge of a wide range of security solutions and the type of data they produce, and the ability to build content to satisfy use-case requirements (see the query sketch after this listing).
- Background that includes forensic analysis, troubleshooting, and threat mitigation.
- Experience with behavioral analytics and machine learning techniques.
- Ability to work independently and thrive in fast-paced environments.
- Proven success in consultative selling, RFI/RFP responses, and customer presentations.
- Excellent communication and presentation skills, able to engage both technical and non-technical audiences.
- Collaborative spirit and a passion for sharing knowledge and expertise.
- Unwavering commitment to continuous learning and staying ahead of the curve in technology and emerging trends.
- Certifications such as CISSP, CEH, or GIAC are a plus.
- Willingness to travel 30-50% of the time.

Additional Information - We Take Care Of Our People

As a distributed company, diversity drives our identity. Whether you're looking to launch a new career or grow an existing one, Elastic is the type of company where you can balance great work with great life. Your age is only a number. It doesn't matter if you're just out of college or your children are; we need you for what you can do. We strive to have parity of benefits across regions, and while regulations differ from place to place, we believe taking care of our people is the right thing to do.
- Competitive pay based on the work you do here and not your previous salary
- Health coverage for you and your family in many locations
- Ability to craft your calendar with flexible locations and schedules for many roles
- Generous number of vacation days each year
- Increase your impact - We match up to $2000 (or local currency equivalent) for financial donations and service
- Up to 40 hours each year to use toward volunteer projects you love
- Embracing parenthood with a minimum of 16 weeks of parental leave

Different people approach problems differently. We need that. Elastic is an equal opportunity/affirmative action employer committed to diversity, equity, and inclusion. Qualified applicants will receive consideration for employment without regard to race, ethnicity, color, religion, sex, pregnancy, sexual orientation, gender perception or identity, national origin, age, marital status, protected veteran status, disability status, or any other basis protected by federal, state, or local law, ordinance, or regulation. We welcome individuals with disabilities and strive to create an accessible and inclusive experience for all individuals. To request an accommodation during the application or the recruiting process, please email candidate_accessibility@elastic.co. We will reply to your request within 24 business hours of submission.

Applicants have rights under Federal Employment Laws; view the posters linked below: Family and Medical Leave Act (FMLA) Poster; Equal Employment Opportunity (EEO) Poster; Pay Transparency Nondiscrimination Provision Poster; Employee Polygraph Protection Act (EPPA) Poster; and Know Your Rights Poster.

Elasticsearch develops and distributes encryption software and technology that is subject to U.S. export controls and licensing requirements for individuals who are located in or are nationals of the following sanctioned countries and regions: Belarus, Cuba, Iran, North Korea, Russia, Syria, the Crimea Region of Ukraine, the Donetsk People's Republic ("DNR"), and the Luhansk People's Republic ("LNR"). If you are located in or are a national of one of the listed countries or regions, an export license may be required as a condition of your employment in this role. Please note that national origin and/or nationality do not affect eligibility for employment with Elastic.

Please see here for our Privacy Statement.
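To illustrate the kind of detection content a security solutions architect might build on Elastic, here is a minimal sketch using the official Python client to aggregate recent authentication failures by user. The endpoint, API key, index pattern, and field names are hypothetical placeholders following ECS-style conventions.

```python
from elasticsearch import Elasticsearch

# Sketch: aggregate recent failed logins by user, a simple building block
# for SIEM detection content. Endpoint, key, index, and fields are invented.
es = Elasticsearch("https://siem.example.com:9200", api_key="EXAMPLE_KEY")

resp = es.search(
    index="logs-auth-*",
    query={
        "bool": {
            "filter": [
                {"term": {"event.outcome": "failure"}},
                {"range": {"@timestamp": {"gte": "now-15m"}}},
            ]
        }
    },
    aggs={"by_user": {"terms": {"field": "user.name", "size": 10}}},
    size=0,
)
print(resp["aggregations"]["by_user"]["buckets"])
```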

Posted 2 weeks ago

Apply

20.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Business Information

Hitachi Energy India Development Centre (IDC) is a research and development facility with around 500 R&D engineers, specialists, and experts who focus on creating and sustaining digital solutions, new products, and technology. This includes product integration, testing, cybersecurity, and certification. The India Development Centre is situated in Chennai and Bangalore. IDC collaborates with the R&D and research centres of Hitachi Energy, which are spread across more than 15 locations in 12 countries. In the past 20 years, IDC has secured more than 200 international papers and 150+ patents.

Mission Statement

We are advancing the world's energy system to become more sustainable, flexible, and secure whilst balancing social, environmental, and economic value. Hitachi Energy has a proven record and unparalleled installed base in more than 140 countries.

Your Responsibilities
- Self-dependent and structured work.
- Creative innovation driver with strong ownership in IT and OT technologies.
- Able to work in a fuzzy context where different solutions are being evaluated and discussed, establishing structure as needed.
- Deep understanding of agile and lean product development methodologies.
- Work experience in a power systems environment is a plus.
- Fair understanding of condition monitoring and asset management.
- Living Hitachi Energy's core values of safety and integrity, which means taking responsibility for your own actions while caring for your colleagues and the business.

Your Background
- Bachelor's / master's degree in engineering in Computer Science / Information Technology / Electronics and Communication, or M.Sc. in Substation Automation, with documented qualification in IT technologies and micro-service architectures.
- Proven and validated experience in micro-service architecture development for the cloud.
- Experience in Agile/Scrum/SAFe Agile methodologies.
- At least 10 years of software development experience.
- At least 5 years of experience in .NET Core Web API and application design and development.
- Proficient in microservice-based application design and development using .NET Core, Kubernetes, PostgreSQL, and Azure Service Bus or equivalent.
- Experience in developing secure cloud-native applications using Azure PaaS services such as Azure Function App, AKS, Service Bus, and Key Vault (see the Key Vault sketch after this listing).
- Familiar with web UI development using any UI framework or library such as React JS or Angular.
- Familiar with Azure DevOps for creating build and deployment pipelines.
- Knowledgeable in application security aspects such as secret management, cryptography, and secure communication for HTTP and WebSocket; other skills such as certificate management, data encryption, etc.
- Excellent problem-solving skills and ability to work independently and lead a team.
- Proficiency in both spoken and written English is required.

Location: Chennai, Tamil Nadu, India
Job type: Full time
Experience: Experienced
Job function: Engineering & Science
Contract: Regular
Publication date: 2024-05-21
Reference number: R0051128
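The role calls for secret management with Azure Key Vault. As a minimal sketch (in Python rather than .NET, purely for illustration), here is how a secret can be read with the Azure SDK; the vault URL and secret name are hypothetical placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Sketch: read a secret from Azure Key Vault using the default credential
# chain (managed identity, environment, CLI login, etc.). Names are invented.
credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://example-vault.vault.azure.net", credential=credential
)

secret = client.get_secret("service-bus-connection")  # hypothetical secret name
print(secret.name, "retrieved (value deliberately not printed)")
```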

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Simeio is a global identity and access management service provider focused on protecting organizations' key data and access requirements to business-critical systems and applications. Simeio provides services such as Access Management, IGA, PAM, and CIAM; our wider service offerings include support, upgrades, governance, and application onboarding.

The Opportunity

We're looking for a talented senior .NET developer to join our team and drive the development of scalable, high-performance systems. You'll focus on .NET backend and API development, with opportunities to collaborate on front-end integration and contribute to secure, enterprise-grade solutions.

The Role:
- Design, build, and maintain scalable applications using .NET (C#/ASP.NET).
- Develop and enhance RESTful APIs to support business-critical features.
- Work closely with front-end engineers to deliver cohesive user experiences (basic familiarity with front-end frameworks is beneficial).
- Ensure code quality through unit testing, code reviews, and best practices.
- Collaborate across teams to define technical requirements and deliver solutions that meet security and performance standards.
- Document development processes and contribute to ongoing architectural improvements.

Core Requirements
- Strong experience in .NET development (C#, ASP.NET).
- Proven track record in building and maintaining REST APIs.
- Familiarity with front-end technologies such as JavaScript, HTML/CSS, and frameworks like Angular or React.
- Solid understanding of relational databases and SQL.
- Comfortable working in agile environments with cross-functional teams.

Preferred Skills
- Exposure to or experience with RBAC (Role-Based Access Control) systems.
- Awareness of security best practices, including authentication protocols, encryption, and secure API design.
- Experience with identity providers (e.g., OAuth2, OpenID Connect, LDAP) is a plus (see the token sketch after this listing).

About Simeio

Simeio has over 650 talented employees across the globe. We have offices in the USA (Atlanta HQ and Texas), India, Canada, Costa Rica, and the UK. Founded in 2007, and now backed by private equity company ZMC, Simeio is recognized as a top IAM provider by industry analysts. Alongside Simeio's identity orchestration tool 'Simeio IO', Simeio also partners with industry-leading IAM software vendors to provide access management, identity governance and administration, privileged access management, and risk intelligence services across on-premise, cloud, and hybrid technology environments. Simeio provides services to numerous Fortune 1000 companies across all industries, including financial services, technology, healthcare, media, retail, public sector, utilities, and education.

About Your Application

We review every application received and will get in touch if your skills and experience match what we're looking for. If you don't hear back from us within 10 days, please don't be too disappointed – we may keep your CV on our database for any future vacancies, and we would encourage you to keep an eye on our career opportunities, as there may be other suitable roles.

Simeio is an equal opportunity employer. If you require assistance with completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employee selection process, please direct your inquiries to the recruitment team at recruitment@simeio.com or +1 404-882-3700.
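Since the posting lists OAuth2 among preferred identity-provider experience, here is a minimal sketch of the client-credentials grant in Python. The token URL, client credentials, and scope are hypothetical placeholders; production code would keep the secret in a vault rather than a literal.

```python
import requests

# Sketch: OAuth2 client-credentials grant, a common machine-to-machine
# flow behind secured REST APIs. All endpoint and credential values are
# hypothetical placeholders.
TOKEN_URL = "https://idp.example.com/oauth2/token"

resp = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "client_credentials",
        "client_id": "api-service",
        "client_secret": "secret",       # placeholder; use a vault in practice
        "scope": "orders:read",
    },
    timeout=10,
)
resp.raise_for_status()
access_token = resp.json()["access_token"]
headers = {"Authorization": f"Bearer {access_token}"}  # attach to API calls
```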

Posted 2 weeks ago

Apply

3.0 years

3 - 8 Lacs

Hyderābād

On-site

Job Description:

Core Responsibilities:
- This position is responsible for application and system database administration, which includes the development and design of databases that support applications and systems.
- Database configuration, performance, reliability, and recoverability; maintaining and upgrading database software and related components.
- Operational database support for various DBMS software levels, versions, and operating systems.
- Ensuring availability, performance, integrity, security, and confidentiality of databases; managing backups and recoveries; analyzing and resolving problems; managing disk space; applying patches and upgrades; and working with database vendor support.
- Developing and implementing best practices and standards, SQL tuning, automation, and project implementation activities.
- Design, implement, and troubleshoot scalable and reusable software systems: 3-tier and Microsoft Azure cloud-based systems.
- Design specifications and effort estimates.
- Actively support configuration management of code and software.
- Support detailed documentation of systems and features.
- Act as liaison between external vendors and internal product, business, engineering, and design teams.
- Actively participate in coding exercises and peer code reviews as part of the development life cycle and change management.
- Actively participate in daily stand-up meetings.
- Requires 3-8 years of experience; deep technical knowledge and subject-matter expertise.

Skillset for DBA:
- SQL programming (SQL queries, stored procedures, functions, and triggers).
- Proficiency in systems like Oracle, MySQL, PostgreSQL, SQL Server, NoSQL, MongoDB, and DB2.
- Experience in relational and non-relational DBMS.
- Knowledge of database schema design, normalization, and indexing.
- Expertise in backup strategies, disaster recovery, and high-availability solutions (see the backup-check sketch after this listing).
- Skills in optimizing database performance, query tuning, and monitoring.
- Implementation of security protocols, encryption, and access control measures.
- Creating, implementing, and maintaining disaster recovery plans.
- Familiarity with various operating systems such as Windows, Linux, and Unix.
- Configuring alerts for proactive management.
- Proficiency in scripting languages such as Shell, Python, Perl, or PowerShell for automation.
- Knowledge of Azure services; DB infrastructure and management services.
- Proficiency in Azure SQL Database, including creation, configuration, and management.
- Understanding of high availability, backup, and scaling on Azure services.
- VM management; networking configuration and management.
- Familiarity with command-line tools; Azure resource monitoring; familiarity with ADO.
- Familiarity with vertical and horizontal scaling.
- Proficiency in automating tasks using Azure Automation, PowerShell, and Azure CLI.
- Experience in writing scripts for routine database tasks and incident response.
- Setting up and using Azure Monitor, Log Analytics, and Application Insights for monitoring databases.
- Expertise in tuning performance on Azure SQL Databases and Managed Instances; use of tools like Query Performance Insight and SQL Analytics.

Skills, Knowledge, and Experience:
- Extensive full-stack engineering experience, with an emphasis on frontend and backend programming, ideally a minimum of 3+ years.
- Strong technical leadership and project delivery, including via vendors.
- Extensive experience, ideally a minimum of 3+ years, in the following:
  - Software design/architecture.
  - Object-oriented programming (e.g., Java, C#, Python, PHP, Perl, etc.).
  - Database concepts: relational databases (MSSQL, Oracle, MySQL, etc.) and NoSQL databases (Cosmos DB, MongoDB, etc.).
  - HTML, CSS, JavaScript.
  - SOLID principles, design patterns.
  - Web API experience and architectural styles (e.g., REST).
  - Familiarity with unit testing, TDD, and BDD.
  - Modern JavaScript frameworks (e.g., React, Angular 6+).
  - Configuration management experience (e.g., GitHub, Jenkins, Git, etc.).
- Experience in the following areas would be desirable:
  - Microsoft Azure cloud-based technologies.
  - Container technologies (e.g., Docker).
  - Software methodologies (Waterfall, Scrum, etc.).
  - Azure DevOps a plus.

Education Qualifications:
- Bachelor-level degree or equivalent in Computer Science or a related field of study.
- 3+ years of experience as a Full Stack Developer.
- Technical or professional certification in the domain.

Weekly Hours: 40
Time Type: Regular
Location: Hyderabad, Andhra Pradesh, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state, or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.

Job ID R-65755
Date posted 05/02/2025

Benefits: Your needs? Met. Your wants? Considered. Take a look at our comprehensive benefits: Paid Time Off, Tuition Assistance, Insurance Options, Discounts, Training & Development.
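As an example of the scripted routine database tasks this posting mentions, the sketch below checks for databases whose most recent full backup is older than 24 hours, using Python and pyodbc against msdb. Server and credentials are hypothetical placeholders.

```python
import pyodbc

# Sketch: alert on stale full backups, a typical scripted DBA check.
# Connection string values are hypothetical placeholders.
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=azsql.example.com;DATABASE=msdb;UID=dba;PWD=secret;Encrypt=yes"
)

QUERY = """
SELECT d.name, MAX(b.backup_finish_date) AS last_full_backup
FROM sys.databases AS d
LEFT JOIN msdb.dbo.backupset AS b
       ON b.database_name = d.name AND b.type = 'D'  -- 'D' = full backup
WHERE d.name <> 'tempdb'
GROUP BY d.name
HAVING MAX(b.backup_finish_date) IS NULL
    OR MAX(b.backup_finish_date) < DATEADD(hour, -24, GETDATE())
"""

with pyodbc.connect(CONN_STR) as conn:
    for name, last in conn.cursor().execute(QUERY).fetchall():
        print(f"STALE BACKUP: {name} (last full: {last})")
```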

Posted 2 weeks ago

Apply

10.0 years

3 - 4 Lacs

Hyderābād

On-site

SMTS - ASIC Security Design Verification Engineer
Hyderabad, India | Engineering | 56869

Job Description

WHAT YOU DO AT AMD CHANGES EVERYTHING

We care deeply about transforming lives with AMD technology to enrich our industry, our communities, and the world. Our mission is to build great products that accelerate next-generation computing experiences – the building blocks for the data center, artificial intelligence, PCs, gaming, and embedded. Underpinning our mission is the AMD culture. We push the limits of innovation to solve the world's most important challenges. We strive for execution excellence while being direct, humble, collaborative, and inclusive of diverse perspectives.

AMD together we advance_

SMTS Silicon Design Engineer - Lead ASIC Security Design Verification Engineer

Position Overview

We are seeking a Lead ASIC Security Design Verification Engineer to drive verification strategy and lead a team of engineers. The role involves architecting verification environments, mentoring team members, and ensuring robust security implementations in complex ASIC designs.

Essential Responsibilities
- Lead and manage a team of design verification engineers.
- Define and drive verification methodology and strategy for security-focused ASIC projects.
- Architect advanced verification environments using UVM methodology.
- Review and approve verification plans, test scenarios, and coverage metrics.
- Guide technical decisions for verification infrastructure and framework development.
- Establish best practices for security verification across projects.
- Drive cross-functional collaboration with design, software, and product teams.
- Provide technical leadership in verification reviews and project meetings.
- Mentor and develop team members' technical and professional growth.

Required Qualifications
- Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field.
- 10+ years of hardware engineering experience, with at least 5 years in lead verification roles.
- Proven track record of leading complex verification projects and teams.
- Expert-level knowledge of RTL design verification using Verilog/SystemVerilog/UVM methodology.
- In-depth understanding of security protocols and cryptographic implementations (see the AES-GCM sketch after this listing):
  - Symmetric and asymmetric cryptography
  - Public/private key infrastructure
  - Hash functions and random number generators
  - Encryption/signature algorithms (AES, SHA, GMAC)
  - Inline cryptography
- Advanced programming skills in Verilog, C/C++, Python, and Perl.
- Strong experience in verification planning and coverage-driven verification.
- Exceptional problem-solving and debugging skills.
- Outstanding leadership and communication abilities.

Preferred Qualifications
- Master's degree in Electrical Engineering.
- Experience leading security-critical tape-outs.
- Expertise in hardware security architecture and threat modeling.
- Knowledge of formal verification methodologies.
- Experience with verification IP development and reuse strategies.
- Track record of implementing verification process improvements.
- Publications or patents in hardware security or verification.

Leadership Competencies
- Proven ability to build and lead high-performing technical teams.
- Excellence in project planning and execution.
- Strong decision-making and problem-solving abilities.
- Effective stakeholder management skills.
- Ability to mentor and develop team members.
- Strategic thinking and innovation mindset.

Technical Leadership
- Establish verification standards and methodologies.
- Drive adoption of new verification technologies and tools.
- Lead technical reviews and design discussions.
- Guide architectural decisions for verification environments.
- Contribute to organizational verification strategy.

AMD does not accept unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services. AMD and its subsidiaries are equal opportunity, inclusive employers and will consider all applicants without regard to age, ancestry, color, marital status, medical condition, mental or physical disability, national origin, race, religion, political and/or third-party affiliation, sex, pregnancy, sexual orientation, gender identity, military or veteran status, or any other characteristic protected by law. We encourage applications from all qualified candidates and will accommodate applicants' needs under the respective laws throughout all stages of the recruitment and selection process.
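The qualification list names AES and GMAC. As a purely illustrative software reference point (the role verifies RTL implementations, not this library), here is AES-GCM authenticated encryption with Python's cryptography package.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Sketch: AES-GCM authenticated encryption, one of the algorithm families
# listed in the posting. Illustrative software reference only; hardware
# verification would target an RTL implementation against vectors like these.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)                    # 96-bit nonce; never reuse per key
aad = b"header-v1"                        # authenticated but not encrypted

ct = AESGCM(key).encrypt(nonce, b"sensitive payload", aad)
pt = AESGCM(key).decrypt(nonce, ct, aad)  # raises InvalidTag on tampering
assert pt == b"sensitive payload"
```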

Posted 2 weeks ago

Apply

0 years

4 - 7 Lacs

Hyderābād

On-site

Date: May 30, 2025
Job Requisition Id: 61489
Location: Hyderabad, IN

IBG SAP PI Interview Questions

1. Differentiate between PI and CPI.
SAP Process Integration (PI) / Process Orchestration (PO):
- Deployment: On-premise.
- Focus: Primarily designed for integrating on-premise applications and systems.
- Scalability: Can be highly scalable, particularly for large SAP landscapes.
- Complexity: Can be more complex to manage, especially for large or complex projects.
- Integration: Enables integration between different SAP systems and also with non-SAP systems.
- Example: Used for integrating different SAP modules like SAP CRM, SAP ECC, and SAP SCM.
SAP Cloud Platform Integration (CPI):
- Deployment: Cloud-based.
- Focus: Designed for integrating cloud applications, on-premise applications, and third-party systems.
- Scalability: Offers good scalability and flexibility for integrating cloud-based applications.
- Ease of Use: Generally considered easier to manage and use, with a more intuitive user interface.
- Integration: Facilitates integration between cloud applications, on-premise applications, and third-party systems.

2. How does SAP PI handle complex scenarios involving multiple systems, asynchronous communication, and various integration patterns (e.g., point-to-point, hub-and-spoke)?
Answer: SAP PI uses a robust architecture with components like the Integration Engine, Adapter Engine, and Integration Directory to manage complex integrations. The Integration Engine routes messages, the Adapter Engine handles communication with different systems, and the Integration Directory provides a central repository for integration artifacts. Asynchronous communication is supported through message queues, and various integration patterns can be implemented by configuring the Integration Engine and Adapter Engines appropriately.

3. What are the key considerations when designing and implementing a complex SAP PI solution, especially concerning scalability, performance, and security?
Answer:
- Scalability: Ensure the system can handle the expected message volume and system load. This may involve optimizing the Integration Engine and Adapter Engine configurations, as well as using appropriate message queues and infrastructure.
- Performance: Monitor message processing times and identify bottlenecks. Optimize mappings, message formats, and adapter configurations to improve performance.
- Security: Implement robust security measures, including user authentication, authorization, and data encryption, to protect sensitive data during transmission and storage. Use secure communication protocols and ensure that the PI system is hardened against security threats.

4. How does SAP PI handle data transformations and mappings in complex scenarios involving different data structures and formats?
Answer: SAP PI provides powerful mapping capabilities through the Integration Engine, allowing for complex data transformations and mappings. You can use the Integration Engine's mapping tools to transform data between different formats, perform calculations, and enrich data based on business requirements. You can also use external mapping tools or custom mappings to handle more complex scenarios. (A plain-Python illustration of this kind of structural mapping follows this listing.)

5. How do you troubleshoot and monitor a complex SAP PI landscape, including identifying and resolving issues related to message failures, performance bottlenecks, and security breaches?
Answer: SAP PI provides monitoring tools and business logs to track message flows, identify failures, and monitor performance. You can use these tools to track message status, analyze performance metrics, and identify potential issues. For security breaches, you can use security logs and monitoring tools to detect and respond to security events.

6. How does SAP PI integrate with other SAP technologies, such as SAP Business Process Management (BPM) and SAP Cloud Platform Integration (CPI)?
Answer: SAP PI can integrate with SAP BPM for process orchestration, allowing you to model and execute complex business processes that span multiple systems. SAP CPI is the next-generation integration platform, and it can be used to build cloud-based integrations and extend the capabilities of SAP PI.

7. How do you configure an iDoc collection scenario?
For outbound iDoc collection, you need to provide the iDoc package size in the Partner Profile (WE20) and select the option 'Collect iDocs'. On the PI side, the sender communication channel should be configured to handle 'Multiple iDocs in Same XI Message'.

8. What are the receiver routing techniques available in SAP PI/PO?
Standard Receiver Determination and Extended (Dynamic) Receiver Determination are the main methods to define routing in SAP PI interfaces.

9. What are the different types of user-defined functions?
Single Value, All Values in Context, and All Values in Queue.

10. What is the purpose of the EDI Separator?
The EDI Separator is an adapter provided with the B2B Toolkit. It splits bulk (batch) EDI messages into individual EDI messages for processing. The EDI Separator supports the EDI message formats ANSI ASC X12, EDIFACT, Odette, and VDA.

11. What are the standard functions (objects) used in extended receiver configuration?
Extended Receiver Determination allows us to derive message receivers dynamically from a message mapping program (this is different from routing messages using XPath rules). Several standard objects delivered by SAP under SWCV SAP BASIS and namespace 'http://sap.com/xi/XI/System' are needed: use the standard Service Interface 'ReceiverDetermination' and Message Type 'Receiver' to implement Extended Receiver Determination.

12. What are data type enhancements and how do you configure them?
Read my article on Data Type Enhancements.

13. How do you search for the PI message of an inbound iDoc if you know the iDoc number?
The PI message ID can be found in the iDoc control record: look in the iDoc Archive Key of the control record, then search for the PI message in the message monitor using that message ID. If you have configured the iDoc Monitor in SAP PI, you can search for the PI message directly using the iDoc number.

14. How do you configure AS2 adapter certificates?
AS2 certificates are installed in the Key Storage of NetWeaver Administrator (NWA). Public certificates of the PI host and third-party systems are exchanged and installed. In PI, keys are installed as a combination of Key Store View and Key Store Entry.

15. How do you set up an SAP ABAP system in the System Landscape Directory?
Usually, SAP technical systems are installed in the SLD by the BASIS team. You need to create the Product, Software Component Version, and Business System of the SAP system.

16. What is the functionality of the Service Registry?
The Service Registry is the central location for web services. It allows us to expose web services of the PI host in accordance with Service-Oriented Architecture (SOA). Read how to configure service endpoints in SR.

17. What is the purpose of local Software Component Versions (SWCVs) and what are their limitations?
Local SWCVs are used to test message mapping programs. Objects in local SWCVs cannot be used in end-to-end integration scenarios or viewed in the Integration Directory (ID).

18. How many SWCVs are required to build an interface with one sender and one receiver?
I prefer to use a three-tier architecture to represent an integration: one SWCV for the sender, one for the receiver, and another for cross-system objects such as Message Mappings and Operation Mappings.

19. What is the difference between a Business System and a Business Component?
Previously known as a Business Service, a Business Component is an abstract representation of a system whose attributes are unknown or partially known. A Business System, on the other hand, represents a known system in the SLD, for example, internal systems in the organization landscape. Business Systems require underlying Technical Systems. All SAP systems should be represented as Business Systems.

20. Highlight a few activities in SAP post-go-live knowledge management.
Activities include monitoring system performance, managing message queues, and ensuring data consistency.

21. What is an Adapter Engine? Mention the use of the Advanced Adapter Engine (AAE) in the SAP PI system.
The Adapter Engine handles communication between SAP PI and external systems. The Advanced Adapter Engine (AAE) provides enhanced performance and supports additional adapters. In a sentence: SAP PI/PO is a middleware technology that enables seamless integration and process automation within an organization.

22. Explain synchronous communication under SAP PI. Highlight a few advantages and disadvantages.
Synchronous communication involves real-time data exchange where the sender waits for a response. Advantages include immediate feedback and data consistency. Disadvantages include potential delays and system dependency.

23. Explain asynchronous communication under SAP PI. Highlight a few advantages and disadvantages.
Asynchronous communication involves data exchange without waiting for an immediate response. Advantages include reduced system dependency and improved performance. Disadvantages include potential data inconsistency and delayed feedback.

24. What is the Global Container and what are its uses in SAP XI?
The Global Container is a storage area used to store data that can be accessed across different message mappings and transformations.

25. List the various adapters in the Advanced Adapter Engine and Integration Engine in the PI system.
Adapters include HTTP, SOAP, JDBC, File, and IDoc adapters. They are used for communication between SAP PI and external systems.

26. List the components you can monitor under the Configuration and Monitoring options.
Components include message flows, communication channels, and system performance metrics.

27. How many SAP sessions can you work with for a particular client at a particular time?
You can work on up to six SAP sessions for a particular client at a particular time.

Key Responsibilities:
- Design, develop, and maintain SAP PI/PO integration solutions to support business processes.
- Configure and manage adapters (IDoc, SOAP, REST, JDBC, File, SFTP, etc.) and work with various protocols (XML, JSON, HTTP, FTP, AS2).
- Develop message mappings using graphical mapping, XSLT, and Java-based mappings.
- Implement BPM (Business Process Management) and BRM (Business Rules Management) solutions for workflow automation.
- Troubleshoot and resolve complex integration issues and optimize performance.
- Ensure compliance with security standards, including encryption techniques and authentication mechanisms.
- Collaborate with functional and technical teams to understand integration requirements.
- Work in Agile and DevOps environments, leveraging tools like Jenkins, Git, and CI/CD pipelines for automation.
- Provide application support and maintenance for existing integrations.

Skills Required:
- Strong experience in SAP PI/PO development and support.
- Proficiency in Java scripting for developing user-defined functions (UDFs) and adapter modules.
- Solid understanding of SAP CRM integration and related business processes.
- Hands-on experience with BPM and BRM for workflow automation.
- Knowledge of cloud platforms (AWS, Azure, Google Cloud) and SAP cloud integrations.
- Strong problem-solving skills and the ability to troubleshoot integration issues.
- Excellent communication and teamwork skills.

Preferred Skills:
- Proficiency in Java, JavaScript, and XML transformations.
- Experience working with security standards, encryption techniques, and compliance regulations.
- Familiarity with DevOps tools (Jenkins, Git) and CI/CD practices.
- SAP certification in SAP PO or a related area is a plus.

IBG

Posted 2 weeks ago

Apply

0 years

3 Lacs

Hyderābād

On-site

Job Summary
The Security Analyst is responsible for ensuring the security and integrity of the organization's information systems and data. This role involves identifying and mitigating security risks, reviewing project security requirements, and maintaining compliance with security standards. The Security Analyst will also focus on detection engineering by designing systems to detect malicious activities and implementing automation technologies to streamline security operations, including vulnerability management and incident response.

General Duties and Responsibilities
Information Security Analyst duties and responsibilities include:
Identify and ensure mitigation of information security risks within the organization.
Apply security policies, standards, procedures, and practices across various types of projects.
Review requests for adherence to security policies, assuring requests are executed correctly.
Identify security incidents and respond to ensure threats and risks are contained.
Maintain the integrity of security controls, toolsets, and other security-relevant services.
Develop and analyze security reports, and build presentations as required.
Provide status reports and other relevant information to compliance staff and department leadership.
Monitor and audit systems for security violations, vulnerabilities, and abnormalities.
Develop, implement, and maintain alignment with security control frameworks.
Update security policies, standards, procedures, practices, and operating procedures as required.
Assist with incident handling and other incident response activities, as required.
Complete and monitor the status of corrective action plans, and resolve audit findings and security issues, ensuring problems are resolved in an effective and timely manner.
Implement and evaluate the effectiveness of data loss prevention (DLP) policies and detections.
Design, build, and fine-tune systems and processes to detect malicious activities or unauthorized behaviors.
Implement tools, processes, and procedures to identify unusual or suspicious behavior that may indicate a breach.
Create actionable alerts based on detected threats to prompt immediate response from the concerned teams.
Implement automation technologies to streamline security operations such as vulnerability management, threat detection, and incident response.
Use automation to reduce incident response time by enabling swift threat remediation through predefined actions.

Educational and Certification Requirements
A degree in Cybersecurity, Information Technology, Computer Science, or a related field is desirable. Industry-recognized certifications are a plus and may include CISSP (Certified Information Systems Security Professional), CISM (Certified Information Security Manager), CEH (Certified Ethical Hacker), CompTIA Security+, and certifications issued by the SANS Institute. Certifications issued by public cloud providers (AWS, Azure, Google, Oracle, etc.) are a plus.

General Knowledge, Skills, and Abilities
As well as formal qualifications, an Information Security Analyst should possess:
A working-level understanding of controls (e.g., access control, auditing, authentication, encryption, and system integrity).
Familiarity with operating systems such as Linux (various distributions) and Microsoft Windows.
Experience with Microsoft Active Directory, encryption algorithms, authorization and authentication mechanisms/software, network monitoring, TCP/IP networks, DNS, next-generation firewalls, and intrusion detection/prevention systems.
General knowledge of network design, common network protocols, and infrastructure systems.
Ability to create scripts to automate processes in PowerShell, Python, or Bash is a plus.
Ability to recognize and analyze malware.
Ability to analyze large data sets and identify patterns and anomalies.
Ability to quickly create and deploy countermeasures or mitigations under pressure.
Ability to build effective relationships, developing and using collaborative relationships to facilitate the accomplishment of work goals.
Experience with the PCI DSS, ISO 27001, and/or SOC 2 compliance frameworks is a plus.
Experience implementing and measuring security controls aligned with NIST 800-53 and the Center for Internet Security (CIS) is a plus.
Project management skills are a plus.
Experience with the following technologies is a plus: SentinelOne Singularity Platform, Tanium, Google Chronicle SIEM, Cloudflare L3-L7 security technologies, Tenable.io, Lacework, Recorded Future, KnowBe4, ServiceNow, Jira, Microsoft Defender for Endpoint, Microsoft Security and Compliance, Microsoft Azure Key Vault.
Experience with the native security service solutions of public cloud service providers (AWS, Google, Azure, Oracle) is a plus.

Job Type: Contractual / Temporary
Contract length: 6-12 months
Pay: From ₹322,415.01 per year
Schedule:
Day shift
Monday to Friday
Morning shift
Night shift
Rotational shift
Work Location: In person
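A minimal sketch of the kind of detection-and-alerting automation this role describes: scanning authentication log lines for repeated failed logins from a single source. The log format, threshold, and sample data are illustrative assumptions, not anything specified by the employer.

```python
import re
from collections import Counter

# Hypothetical syslog-style line, e.g.:
# "Jun 01 10:02:11 host sshd[123]: Failed password for admin from 203.0.113.7"
FAILED_LOGIN = re.compile(r"Failed password for \S+ from (?P<ip>\d+\.\d+\.\d+\.\d+)")
THRESHOLD = 5  # assumed alerting threshold; tune per environment

def detect_bruteforce(log_lines):
    """Count failed logins per source IP and return the IPs over the threshold."""
    hits = Counter()
    for line in log_lines:
        match = FAILED_LOGIN.search(line)
        if match:
            hits[match.group("ip")] += 1
    return {ip: n for ip, n in hits.items() if n >= THRESHOLD}

if __name__ == "__main__":
    sample = ["Jun 01 10:02:11 host sshd[123]: "
              "Failed password for admin from 203.0.113.7"] * 6
    for ip, count in detect_bruteforce(sample).items():
        # In production this would feed a SIEM or ticketing system instead of stdout.
        print(f"ALERT: {count} failed logins from {ip}")
```

In practice the output would be routed to a SIEM or ticketing tool such as those named above, with the threshold tuned to reduce alert fatigue.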

Posted 2 weeks ago

Apply

0 years

3 - 4 Lacs

Hyderābād

On-site

Software Verification Engineer
Hyderabad, India Engineering 57588

Job Description
WHAT YOU DO AT AMD CHANGES EVERYTHING
We care deeply about transforming lives with AMD technology to enrich our industry, our communities, and the world. Our mission is to build great products that accelerate next-generation computing experiences – the building blocks for the data center, artificial intelligence, PCs, gaming and embedded. Underpinning our mission is the AMD culture. We push the limits of innovation to solve the world's most important challenges. We strive for execution excellence while being direct, humble, collaborative, and inclusive of diverse perspectives. AMD together we advance_

The primary responsibility is the validation of the BootROM, which includes the following tasks:
Develop and execute test cases to validate all boot peripherals from which the FSBL (First Stage Boot Loader) is copied, for example xSPI, SD, eMMC, UFS, and USB.
Create and execute test cases to validate all proprietary boot sequences.
Develop and execute test cases to validate all internal boot modes.
Write and run test cases to validate all supported authentication algorithms.
Develop and execute test cases to validate all supported encryption/decryption algorithms.
Automate tests using Python.
Perform testing on prototyping/emulation platforms, including x86 emulation.
Identify, document, and track issues using JIRA.
Report coverage metrics using tools such as Verdi and add tests to ensure maximum source-line coverage.
Review requirements and create associated test cases to ensure traceability.
Collaborate with different teams to resolve any blockers.
Engage in constructive discussions with the design team to improve the quality of the BootROM.
Conduct security threat analysis using internal tools.
Adhere to safety processes while performing the above tasks.

#LI-SK4
AMD does not accept unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services. AMD and its subsidiaries are equal opportunity, inclusive employers and will consider all applicants without regard to age, ancestry, color, marital status, medical condition, mental or physical disability, national origin, race, religion, political and/or third-party affiliation, sex, pregnancy, sexual orientation, gender identity, military or veteran status, or any other characteristic protected by law. We encourage applications from all qualified candidates and will accommodate applicants' needs under the respective laws throughout all stages of the recruitment and selection process.
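As a flavor of the Python-based test automation this role calls for, here is a minimal, self-contained sketch that verifies a boot image's integrity digest and checks that a tampered image is rejected. The image layout (payload with an appended SHA-256 digest) is a hypothetical stand-in for illustration; real BootROM authentication tests would target the actual signed-image format on the emulation platforms mentioned above.

```python
import hashlib

def verify_image(image: bytes) -> bool:
    """Check a hypothetical boot image whose last 32 bytes are the SHA-256 of the payload."""
    payload, digest = image[:-32], image[-32:]
    return hashlib.sha256(payload).digest() == digest

def test_tampered_image_is_rejected():
    payload = b"FSBL" * 64                         # stand-in first-stage boot loader
    good = payload + hashlib.sha256(payload).digest()
    bad = b"X" + good[1:]                          # flip a byte to simulate corruption
    assert verify_image(good)
    assert not verify_image(bad)

if __name__ == "__main__":
    test_tampered_image_is_rejected()
    print("image verification tests passed")
```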

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Vellore, Tamil Nadu, India

Remote


Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, Postgre SQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, Pyspark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. 
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. 
Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently.

Perks:
Day off on the 3rd Friday of every month (one long weekend each month)
Monthly Wellness Reimbursement Program to promote health and well-being
Monthly Office Commutation Reimbursement Program
Paid paternity and maternity leaves

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
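As one hedged illustration of the "develop import workflows and scripts" responsibility, a minimal pandas + SQLAlchemy sketch that loads a spreadsheet into PostgreSQL. The file name, table name, and connection string are placeholder assumptions.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string; replace with real credentials and host.
engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/research")

def import_spreadsheet(path: str, table: str) -> int:
    """Load a spreadsheet into a relational table, normalizing column names first."""
    df = pd.read_excel(path)  # requires openpyxl for .xlsx files
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates()  # basic consistency pass before the load
    df.to_sql(table, engine, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    rows = import_spreadsheet("partners.xlsx", "partners_raw")  # placeholder names
    print(f"imported {rows} rows")
```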

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Madurai, Tamil Nadu, India

Remote


Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, Postgre SQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, Pyspark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. 
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. 
Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently.

Perks:
Day off on the 3rd Friday of every month (one long weekend each month)
Monthly Wellness Reimbursement Program to promote health and well-being
Monthly Office Commutation Reimbursement Program
Paid paternity and maternity leaves

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
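Since Google BigQuery is named in this posting, here is a minimal sketch of loading a DataFrame into a BigQuery table with the official google-cloud-bigquery client. The project, dataset, and table IDs are placeholders, and application-default credentials are assumed.

```python
import pandas as pd
from google.cloud import bigquery

def load_frame(df: pd.DataFrame, table_id: str) -> None:
    """Append a DataFrame to a BigQuery table (created automatically if absent)."""
    client = bigquery.Client()  # uses application-default credentials
    job = client.load_table_from_dataframe(df, table_id)
    job.result()  # block until the load job finishes
    print(f"loaded {job.output_rows} rows into {table_id}")

if __name__ == "__main__":
    df = pd.DataFrame({"product": ["card_a", "loan_b"], "score": [4.5, 3.9]})
    load_frame(df, "my-project.research.product_scores")  # placeholder IDs
```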

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

Remote


Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, Postgre SQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, Pyspark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. 
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. 
Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently.

Perks:
Day off on the 3rd Friday of every month (one long weekend each month)
Monthly Wellness Reimbursement Program to promote health and well-being
Monthly Office Commutation Reimbursement Program
Paid paternity and maternity leaves

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
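For the "monitor database health" duty in this posting, a minimal sketch that flags long-running PostgreSQL queries via the pg_stat_activity view. The DSN and the five-minute threshold are illustrative assumptions.

```python
import psycopg2

DSN = "dbname=research user=monitor host=localhost"  # placeholder DSN

def long_running_queries(max_minutes: int = 5):
    """Return (pid, duration, query) for active queries older than the threshold."""
    sql = """
        SELECT pid, now() - query_start AS duration, query
        FROM pg_stat_activity
        WHERE state = 'active'
          AND now() - query_start > %s * interval '1 minute'
        ORDER BY duration DESC;
    """
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        cur.execute(sql, (max_minutes,))
        return cur.fetchall()

if __name__ == "__main__":
    for pid, duration, query in long_running_queries():
        # A real monitor would page someone or open a ticket rather than print.
        print(f"pid={pid} running for {duration}: {query[:80]}")
```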

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

On-site

Key Responsibilities:
Verify data accuracy and consistency across multiple sources by reviewing customer-provided data for completeness and identifying discrepancies that require correction or clarification.
Perform logical data recovery on storage devices impacted by issues such as file system corruption, accidental deletion, or partition damage, using software-based recovery techniques.
Manage data encryption and decryption processes using standard tools and methods to ensure data confidentiality and compliance with security protocols.
Communicate directly with customers to understand their data issues, provide updates on case progress, explain technical processes in simple terms, and ensure customer satisfaction throughout the service cycle.
Maintain accurate records of all verification and recovery activities in line with company service standards.

Qualifications:
Diploma or degree in a technical field (BCA preferred)
Customer-oriented mindset with the ability to manage multiple cases simultaneously.

Job Type: Full-time
Pay: ₹10,000.00 per month
Benefits:
Paid sick time
Paid time off
Schedule: Day shift
Work Location: In person
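As a generic illustration of the encryption/decryption responsibility above, a minimal sketch using the Fernet recipe (symmetric, authenticated encryption) from the widely used Python cryptography library. This is not the employer's specific tooling, just the standard pattern.

```python
from cryptography.fernet import Fernet, InvalidToken

def roundtrip_demo() -> None:
    key = Fernet.generate_key()          # store keys securely, never alongside the data
    f = Fernet(key)
    token = f.encrypt(b"customer file contents")
    assert f.decrypt(token) == b"customer file contents"
    try:
        Fernet(Fernet.generate_key()).decrypt(token)   # attempt with the wrong key
    except InvalidToken:
        print("decryption with the wrong key is rejected, as expected")

if __name__ == "__main__":
    roundtrip_demo()
```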

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Faridabad, Haryana, India

Remote


Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, Postgre SQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, Pyspark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. 
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. 
Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently.

Perks:
Day off on the 3rd Friday of every month (one long weekend each month)
Monthly Wellness Reimbursement Program to promote health and well-being
Monthly Office Commutation Reimbursement Program
Paid paternity and maternity leaves

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
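The data-quality duties in this posting (validation rules, constraints, referential integrity) can be prototyped in plain pandas before being enforced in the database. A minimal sketch, with all column names invented for illustration.

```python
import pandas as pd

def validate(df: pd.DataFrame, valid_category_ids: set) -> list[str]:
    """Return a list of human-readable data-quality violations."""
    problems = []
    if df["id"].duplicated().any():
        problems.append("duplicate primary keys in 'id'")
    if df["price"].lt(0).any():
        problems.append("negative values in 'price'")
    missing = df["name"].isna().sum()
    if missing:
        problems.append(f"{missing} rows missing 'name'")
    # Referential-integrity style check against a known dimension table.
    orphans = ~df["category_id"].isin(valid_category_ids)
    if orphans.any():
        problems.append(f"{int(orphans.sum())} rows reference an unknown category_id")
    return problems

if __name__ == "__main__":
    df = pd.DataFrame({"id": [1, 1], "price": [9.5, -2.0],
                       "name": ["a", None], "category_id": [10, 99]})
    for p in validate(df, valid_category_ids={10, 20}):
        print("FAIL:", p)
```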

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote


Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, Postgre SQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, Pyspark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. 
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. 
Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently.

Perks:
Day off on the 3rd Friday of every month (one long weekend each month)
Monthly Wellness Reimbursement Program to promote health and well-being
Monthly Office Commutation Reimbursement Program
Paid paternity and maternity leaves

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
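PySpark appears in this posting's must-have list; here is a minimal sketch of the extract-transform-load shape such a pipeline might take, with paths and column names assumed for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("csv-to-parquet-etl").getOrCreate()

# Extract: placeholder input path with a header row.
raw = spark.read.csv("data/raw/offers.csv", header=True, inferSchema=True)

# Transform: normalize a text column and drop obviously bad or duplicate rows.
clean = (raw
         .withColumn("offer_name", F.trim(F.lower(F.col("offer_name"))))
         .filter(F.col("apr").isNotNull())
         .dropDuplicates(["offer_id"]))

# Load: write partitioned Parquet for downstream warehouse ingestion.
clean.write.mode("overwrite").partitionBy("country").parquet("data/curated/offers/")

spark.stop()
```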

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Conduct regular security assessments, including penetration testing and vulnerability assessments, to identify and remediate potential security risks. Design and implement secure network architectures, including firewalls, intrusion detection/prevention systems, and encryption technologies. Adhere to enterprise governance & review processes to deliver project goals & deliverables. Follow enterprise ITSM & CMDB processes Monitor and analyze security alerts using Security Information and Event Management (SIEM) tools, and respond to security incidents in a timely and effective manner. Collaborate with cross-functional teams to develop and enforce identity and access management (IAM) policies and network access control (NAC) solutions. Stay abreast of the latest security threats, industry trends, and best practices to proactively enhance the organization's network security posture. Ensure compliance with relevant security regulations and standards and assist in audit processes as needed. Essential Experience Bachelors degree in computer science, Information Technology, or related field. Proven experience in network security, with a strong understanding of network security fundamentals, encryption technologies, and secure network design principles. Zscaler Private Access & Internet Access Proficiency in configuring, managing, and troubleshooting firewalls. Cisco, Palo Alto etc,. Experience with intrusion detection/prevention systems, as well as with SIEM tools and security incident response. Excellent problem-solving skills, analytical thinking, and the ability to communicate effectively with diverse stakeholders. CERTIFICATIONS: (any of below) Cisco Certified Network Professional/Expert in Security Zscaler Digital Transformation Administrator Zscaler Zero Trust Certified Associate Cisco Certified CyberOps Associate or Professional Palo Alto Networks Certified Network Security Administrator (PCNSA) Palo Alto Networks Certified Network Security Engineer (PCNSE) Check Point Certified Security Administrator (CCSA) Check Point Certified Security Expert (CCSE) Certified Network Security Professional (e.g., CISSP, CompTIA Security Show more Show less

Posted 2 weeks ago

Apply

7.0 years

6 - 7 Lacs

Chennai

On-site

Job ID R-221158 Date posted 05/29/2025 Job Title: Senior Consultant - SAP PI PO Grade - D2 Introduction to role Are you an SAP Integration professional with hands-on experience in SAP PI/PO/CPI? Do you have a passion for developing SAP integrations using different technologies? If so, we have an exciting opportunity for you! We are looking for a Senior Consultant who is adept at all aspects of the life cycle of integrations. Join us and be part of a team that drives transformational journeys forward. Accountabilities As a Senior Consultant - SAP PI PO, you will: - Take on the role of SAP PI - PO / BTP Developer with both AM & AD perceptions (DevOps) Provide regular BAU support Implement incremental changes and project work Utilize your skills and capabilities to deliver high-quality integration solutions Essential Skills/Experience Minimum 7+ Years of experience in SAP PI/PO Should have handled at least 4 Projects or Support on SAP PI/PO 7.5 Java Stack Shown at least 1 Project or on SAP CPI or SAP BTP Integration Suite Good understanding in BTP cockpit Better understanding of CPI standard processes Good understanding on CPI message Mapping standard methodologies Good working experience in API management open connectors Good knowledge on groovy script java script - PO to CPI migration skills - Working experience on various pallet options Solid grasp on cloud connector Expertise in various SAP PI/PO Tools – NWDS, ESR, ID, RWB and knowledge on SLD Within SAP PI-PO should have worked on various technical adapters like: FTP, SFTP, JDBC, IDoc, RFC, SOAP, REST, HTTP, Proxy, Mail etc. Should have strong expertise in EDI B2B Integration using B2B Addon adapters AS2 & EDI Separator Expertise in Java Mappings & Graphical mappings (including value mappings and lookups) Should have knowledge in handling security artifacts, encryption, and decryption mechanisms Desirable Skills/Experience SAP PI /PO related Java Knowledge Knowledge in SAP API Management Solid grasp in security materials, session handling, authentication methods, set up Be responsible for providing services for application interface production monitoring, job monitoring and making sure system/interfaces are up and running Experience in Certificates / Data Encryption / Data Signing ITSM & SAP SolMan ChaRM experience At AstraZeneca, we are at the forefront of digital transformation, fusing our digital and data capabilities with the support from the business to make it happen. Our team leverages leading technologies and explores data to make improved decisions, helping the business reach the right outcomes quicker. We challenge, innovate, and break away from the norm to find bold new ways of approaching everyday tasks. Empowerment and collaboration are key as we work together to positively impact patients across the world. Our diverse team of specialists continuously expands their knowledge and develops through a two-way feedback loop and novel roles. Ready to shape the future of digital healthcare with us? Apply now! AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. 
We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements. SAP PI PO - Senior Consultant Posted date May. 29, 2025 Contract type Full time Job ID R-221158 APPLY NOW Why choose AstraZeneca India? Help push the boundaries of science to deliver life-changing medicines to patients. After 45 years in India, we’re continuing to secure a future where everyone can access affordable, sustainable, innovative healthcare. The part you play in our business will be challenging, yet rewarding, requiring you to use your resilient, collaborative and diplomatic skillsets to make connections. The majority of your work will be field based, and will require you to be highly-organised, planning your monthly schedule, attending meetings and calls, as well as writing up reports. Who do we look for? Calling all tech innovators, ownership takers, challenge seekers and proactive collaborators. At AstraZeneca, breakthroughs born in the lab become transformative medicine for the world's most complex diseases. We empower people like you to push the boundaries of science, challenge convention, and unleash your entrepreneurial spirit. You'll embrace differences and take bold actions to drive the change needed to meet global healthcare and sustainability challenges. Here, diverse minds and bold disruptors can meaningfully impact the future of healthcare using cutting-edge technology. Whether you join us in Bengaluru or Chennai, you can make a tangible impact within a global biopharmaceutical company that invests in your future. Join a talented global team that's powering AstraZeneca to better serve patients every day. Success Profile Ready to make an impact in your career? If you're passionate, growth-orientated and a true team player, we'll help you succeed. Here are some of the skills and capabilities we look for. 0% Tech innovators Make a greater impact through our digitally enabled enterprise. Use your skills in data and technology to transform and optimise our operations, helping us deliver meaningful work that changes lives. 0% Ownership takers If you're a self-aware self-starter who craves autonomy, AstraZeneca provides the perfect environment to take ownership and grow. Here, you'll feel empowered to lead and reach excellence at every level — with unrivalled support when you need it. 0% Challenge seekers Adapting and advancing our progress means constantly challenging the status quo. In this dynamic environment where everything we do has urgency and focus, you'll have the ability to show up, speak up and confidently take smart risks. 0% Proactive collaborators Your unique perspectives make our ambitions and capabilities possible. Our culture of sharing ideas, learning and improving together helps us consistently set the bar higher. As a proactive collaborator, you'll seek out ways to bring people together to achieve their best. Responsibilities Job ID R-221158 Date posted 05/29/2025 Job Title: Senior Consultant - SAP PI PO Grade - D2 Introduction to role Are you an SAP Integration professional with hands-on experience in SAP PI/PO/CPI? Do you have a passion for developing SAP integrations using different technologies? If so, we have an exciting opportunity for you! We are looking for a Senior Consultant who is adept at all aspects of the life cycle of integrations. Join us and be part of a team that drives transformational journeys forward. 
Accountabilities As a Senior Consultant - SAP PI PO, you will: - Take on the role of SAP PI - PO / BTP Developer with both AM & AD perceptions (DevOps) Provide regular BAU support Implement incremental changes and project work Utilize your skills and capabilities to deliver high-quality integration solutions Essential Skills/Experience Minimum 7+ Years of experience in SAP PI/PO Should have handled at least 4 Projects or Support on SAP PI/PO 7.5 Java Stack Shown at least 1 Project or on SAP CPI or SAP BTP Integration Suite Good understanding in BTP cockpit Better understanding of CPI standard processes Good understanding on CPI message Mapping standard methodologies Good working experience in API management open connectors Good knowledge on groovy script java script - PO to CPI migration skills - Working experience on various pallet options Solid grasp on cloud connector Expertise in various SAP PI/PO Tools – NWDS, ESR, ID, RWB and knowledge on SLD Within SAP PI-PO should have worked on various technical adapters like: FTP, SFTP, JDBC, IDoc, RFC, SOAP, REST, HTTP, Proxy, Mail etc. Should have strong expertise in EDI B2B Integration using B2B Addon adapters AS2 & EDI Separator Expertise in Java Mappings & Graphical mappings (including value mappings and lookups) Should have knowledge in handling security artifacts, encryption, and decryption mechanisms Desirable Skills/Experience SAP PI /PO related Java Knowledge Knowledge in SAP API Management Solid grasp in security materials, session handling, authentication methods, set up Be responsible for providing services for application interface production monitoring, job monitoring and making sure system/interfaces are up and running Experience in Certificates / Data Encryption / Data Signing ITSM & SAP SolMan ChaRM experience At AstraZeneca, we are at the forefront of digital transformation, fusing our digital and data capabilities with the support from the business to make it happen. Our team leverages leading technologies and explores data to make improved decisions, helping the business reach the right outcomes quicker. We challenge, innovate, and break away from the norm to find bold new ways of approaching everyday tasks. Empowerment and collaboration are key as we work together to positively impact patients across the world. Our diverse team of specialists continuously expands their knowledge and develops through a two-way feedback loop and novel roles. Ready to shape the future of digital healthcare with us? Apply now! AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements. APPLY NOW Explore the local area Take a look at the map to see what’s nearby. Reasons to Join Thomas Mathisen Sales Representative Oslo, Norway Christine Recchio Sales Representative California, United States Stephanie Ling Sales Representative Petaling Jaya, Malaysia What we offer We're driven by our shared values of serving people, society and the planet. 
Our people make this possible, which is why we prioritise diversity, safety, empowerment and collaboration. Discover what a career at AstraZeneca could mean for you. Lifelong learning Our development opportunities are second to none. You'll have the chance to grow your abilities, skills and knowledge constantly as you accelerate your career. From leadership projects and constructive coaching to overseas talent exchanges and global collaboration programmes, you'll never stand still. Autonomy and reward Experience the power of shaping your career how you want to. We are a high-performing learning organisation with autonomy over how we learn. Make big decisions, learn from your mistakes and continue growing — with performance-based rewards as part of the package. Health and wellbeing An energised work environment is only possible when our people have a healthy work-life balance and are supported for their individual needs. That's why we have a dedicated team to ensure your physical, financial and psychological wellbeing is a top priority. Inclusion and diversity Diversity and inclusion are embedded in everything we do. We're at our best and most creative when drawing on our different views, experiences and strengths. That's why we're committed to creating a workplace where everyone can thrive in a culture of respect, collaboration and innovation.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Cuttack, Odisha, India

Remote

Linkedin logo

Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, Postgre SQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, Pyspark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. 
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. 
Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bhubaneswar, Odisha, India

Remote

Linkedin logo

Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, Postgre SQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, Pyspark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. 
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. 
Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Kolkata, West Bengal, India

Remote

Linkedin logo

Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, Postgre SQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, Pyspark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. 
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. 
Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Guwahati, Assam, India

Remote

Linkedin logo

Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, Postgre SQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, Pyspark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. 
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. 
Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less

Posted 2 weeks ago

Apply

Exploring Encryption Jobs in India

The encryption job market in India is rapidly growing as organizations prioritize securing their data and communication channels. Professionals skilled in encryption techniques are in high demand across various industries such as IT, cybersecurity, finance, and healthcare.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

Average Salary Range

The average salary range for encryption professionals in India varies based on experience level. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 15 lakhs per annum.

Career Path

A typical career path in encryption may involve starting as an Encryption Analyst or Cryptographer, progressing to roles such as Security Engineer, Security Architect, and eventually reaching positions like Chief Information Security Officer (CISO) or Encryption Consultant.

Related Skills

In addition to encryption skills, professionals in this field are often expected to have knowledge of cybersecurity, network security, cryptography, programming languages (such as Python, Java), and familiarity with security protocols like SSL/TLS.

Interview Questions

  • What is encryption and why is it important? (basic)
  • Explain the difference between symmetric and asymmetric encryption. (medium)
  • How does a digital signature work in encryption? (medium)
  • What is the role of a key in encryption? (basic)
  • Can you explain the concept of end-to-end encryption? (medium)
  • Describe a scenario where you would use hashing instead of encryption. (advanced)
  • How do you ensure secure key management in encryption processes? (advanced)
  • What is quantum encryption and how does it differ from traditional encryption methods? (advanced)
  • How can you detect if encrypted data has been tampered with? (medium)
  • Discuss the limitations of encryption in ensuring data security. (advanced)
  • Explain the concept of a 'man-in-the-middle' attack in the context of encryption. (medium)
  • How do you mitigate the risks associated with encryption key loss? (advanced)
  • Describe the process of encryption key rotation and its significance. (medium)
  • Can you compare the performance impact of different encryption algorithms? (advanced)
  • How do you handle compliance requirements related to encryption in an organization? (medium)
  • What are some common challenges faced when implementing encryption on a large scale? (advanced)
  • How does encryption contribute to data privacy regulations like GDPR? (medium)
  • Discuss the importance of secure random number generation in encryption. (advanced)
  • How do you ensure the integrity of encrypted data during transmission? (medium)
  • What are the best practices for secure encryption key storage? (advanced)
  • How can you verify the authenticity of an encrypted message? (medium)
  • Discuss the concept of forward secrecy in encryption protocols. (advanced)
  • How do you evaluate the performance impact of encryption on network latency? (medium)
  • Explain the concept of 'salting' in encryption and its purpose. (medium)

Closing Remark

As you prepare for encryption job opportunities in India, remember to showcase your expertise in encryption techniques, cybersecurity knowledge, and related skills during interviews. Stay updated on the latest trends in encryption technology and demonstrate your passion for data security. With the right preparation and confidence, you can excel in the competitive encryption job market in India. Good luck!

cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies