Description - External
Software Engineer in Public Cloud Platform Engineering (AWS, GCP, Azure): work with top engineering talent to build innovative cloud platforms for healthcare applications. This role offers a unique opportunity to engage in analysis, design, coding, engineering, testing, debugging, and more. At our company, every position prioritizes quality in every output. Be part of a team that's transforming healthcare delivery through cutting-edge technology.
•Build and operate secure cloud platform capabilities that meet business requirements
•Innovate to improve efficiency, reduce technical drag, and create usable application patterns
•Practice SRE principles to eliminate repetitive tasks, monitor performance, simplify work practices, define outcomes and metrics, and assure operational quality
•Manage security controls at the platform layer to enable the organization to operate securely, efficiently, and within policy
•Assist multiple partner teams in understanding and applying information security guidance and standards, with the goal of mitigating information security risks
AI Expectations: Will utilize AI tools and applications to perform tasks, make decisions, and enhance their work and overall cloud migration velocity. Expected to use AI-powered software for data analysis, coding, and overall productivity.
Qualifications - External
•Undergraduate degree or equivalent experience
•5+ years of experience as a software development engineer, or equivalent hands-on experience producing code for production systems
•2+ years of experience with a public cloud such as AWS, Azure, or GCP, beyond basic IaaS functionality
•2+ years of experience programming in at least one high-level language (Python, Golang, JavaScript, etc.)
•2+ years of experience with the software build cycle
•1+ years of engineering experience building infrastructure using code and repeatable designs
•1+ years of experience automating CI/CD using GitHub Actions or similar, and using a source control system such as Git
•1+ years of experience with Agile/lean development practices
•1+ years of experience with Terraform
Preferred Qualifications:
•Experience with containers and orchestration platforms such as Kubernetes
•Algorithms, data structures, OO design, and other computer science concepts
•API design and lifecycle management (REST, etc.)
•Application foundations and frameworks (Spring, Flask, etc.)
•Data storage, caching, and optimization (NoSQL databases, Redis, PostgreSQL, etc.)
•Inter-service messaging and streams (SQS, Pub/Sub, Kafka, etc.)
•Instrumentation, logging, and tracing (Prometheus, CloudWatch, Stackdriver, Azure Monitor, etc.)
•Evidence-based approach to making decisions and solving problems
•Demonstrated design mindset - capable of building distributed/scalable services
Primary Responsibilities:
Conduct thorough reviews of requirements and systems analysis documents to ensure completeness
Participate in the creation and documentation of designs, adhering to established design patterns and standards
Independently perform coding and testing tasks, while also assisting and mentoring other engineers as needed
Adhere to coding standards and best practices
Address and resolve defects promptly
Promote and write high-quality code, facilitating automation processes as required
Collaborate with the deployment lead and peer developers to complete various project deployment activities
Ensure proper use of source control systems
Deliver technical artifacts for each project
Identify opportunities to fine-tune and optimize applications
Mentor team members on standard tools, processes, automation, and general DevOps practices
AI Expectations: Will utilize AI tools and applications to perform tasks, make decisions, and enhance their work and overall cloud migration velocity. Expected to use AI-powered software for data analysis, coding, and overall productivity.
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or reassignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Qualifications - External
Required Qualifications:
Bachelor’s degree in computer science or a related field
8+ years of experience working with JavaScript, TypeScript, Node.js, and Go
4+ years of experience working with Python and Terraform
Experience working with microservices architecture
Extensive experience with CI/CD and DevOps initiatives
Experience working in an Agile environment
Development experience with SDKs and APIs from at least one cloud service provider (AWS, GCP, or Azure)
In-depth knowledge of professional software engineering and best practices for the full SDLC, including coding standards, source control management, build processes, and testing
Expertise in building and maintaining platform-level services on top of Kubernetes
Proficiency in provisioning various infrastructure services in AWS, Azure, and GCP
Skilled in provisioning MongoDB Atlas resources using APIs
Familiarity with the GitHub toolset, including GitHub Actions
Proven excellent debugging and troubleshooting skills
Proven exceptional communication and presentation skills
Demonstrated positive attitude and self-motivation
Key Responsibilities:
1. Data Analysis: Analyzing structured and unstructured data to derive actionable insights is crucial for informed decision-making and strategic planning. This ensures that the organization can leverage data to drive business outcomes and improve operational efficiency.
2. Data Architecture Design: Collaborating with the Data Engineering team to design and implement data models, pipelines, and storage solutions is essential for creating a robust data infrastructure. Defining and maintaining data architecture standards and best practices ensures consistency and reliability across data systems. Optimizing data systems for performance, scalability, and security is vital to handle growing data volumes and ensure data integrity and protection.
3. Collaboration and Stakeholder Engagement: Working with business units to understand their data needs and align them with architectural solutions ensures that data initiatives are aligned with business goals. Acting as a liaison between technical teams and business users facilitates effective communication and collaboration, ensuring that data solutions meet user requirements.
4. Data Governance and Quality: Implementing data governance practices to ensure data accuracy, consistency, and security is critical for maintaining high data quality standards. Proactively identifying and addressing data quality issues helps prevent data-related problems and ensures reliable data for analysis and reporting.
Qualifications - External
1. Education: A Bachelor’s or Master’s degree in Computer Science, Data Science, Information Systems, or a related field provides the foundational knowledge required for the role.
2. Experience: 6+ years as a Data Analyst, Data Architect, or similar role ensures that the candidate has the necessary experience to handle complex data tasks and responsibilities. Hands-on experience with data modeling, architecture design, and analytics tools is essential for designing effective data solutions. Proficiency in SQL and data visualization tools enables the candidate to manage and present data effectively. Experience with cloud platforms (e.g., AWS, Azure) is crucial for leveraging modern data infrastructure and services. Familiarity with data warehouse solutions (e.g., Redshift, Snowflake) ensures the candidate can design and manage scalable data storage solutions. Understanding of data governance frameworks and tools is necessary for implementing effective data governance practices.