Partner job posting

Integration Specialist - Campinas / SP


Job Details

  • Education: Not informed
  • Segment: Not informed
  • Salary: Not informed
  • Field: Miscellaneous / Other

What you will do

  • Participate in the development and evangelization of Python coding standards within the organization.
  • Take full responsibility for delivering solutions into production (working through operations teams).
  • Train and mentor developers on the team.
  • Work with Technical Project Management to create and maintain the prioritized backlog and schedule for the team.
  • Make architectural decisions in consultation with other members of engineering leadership.
  • Document every aspect of the project in a standard way for future reference.
  • Storyboard and present insights to senior leadership.
  • Lead and guide a team of Data Engineers.

Mandatory Requirements

  • 8-12 years in programming (Python), data engineering, and design.
  • Proficient in RESTful APIs, with strong Python programming experience.
  • Experience managing multiple Python virtual environments (pip, conda).
  • Architect and implement medium- to large-scale solutions (end to end) on Azure using Azure Data Platform services (Azure Data Lake, Data Factory, Databricks).
  • Experience developing data engineering solutions in Azure for Enterprise Data Management using Spark, process orchestration, and logging and auditing frameworks.
  • Experience developing and implementing ETL solutions using PySpark, Azure Data Factory, and Azure Databricks.
  • Experience working with large volumes of real data using SQL (HP Vertica, Synapse, Teradata, Oracle, or MySQL) and Python.
  • Experience with Airflow (good to have).
  • Able to build insightful visualizations in Power BI and Python (good to have).
  • Knowledge of and experience with APIs (REST, SOAP).
  • Able to build custom connectors that retrieve information from databases via JDBC and ODBC, and from systems through open-source and custom APIs as required.
  • Experience with the Azure cloud platform.
  • Knowledge of and experience with continuous integration and delivery (CI/CD) of Azure Data Factory pipelines in Azure DevOps and GitHub.
  • Knowledge of and experience with the SDLC; has worked in an Agile environment.
  • Propose architectures with Azure cost/spend in mind and develop recommendations to right-size data infrastructure.
  • Tech lead capabilities: manage other data engineers, guide the team, and face the customer for any queries.

IT skills required

  • Languages: Python (mandatory); knowledge of Java is good to have.
  • Azure: Azure Data Factory, Azure Databricks, Azure Functions, Logic Apps, Azure Data Lake, Azure Event Hubs, Delta Lakehouse, Azure SQL.
  • Python, PySpark, Spark, Hive, Azure DevOps, Bash/shell, PowerShell, data warehousing, NoSQL (Cosmos DB).
  • Visualization tool: Power BI.
  • GitHub, Azure DevOps, Git, pip/conda repositories.
  • Enterprise security patterns (Kerberos, SAML, OAuth).
  • Airflow (good to have).

Additional Information

  • Number of openings: 1
  • Work schedule: Not informed