Magama: Integrating applications with an Amazon Lex chatbot
About Magama
Magama is a Chilean startup with four years in the market that delivers an innovative digital experience. It does so through striking immersive solutions that transport its clients into virtual reality via 3D virtual tours, designed both for events and for engineering and architecture work.
Magama also explores the metaverse, combining artificial intelligence with the virtual world and a chatbot that guides the user's navigation. In addition, a voice assistant brings a range of features to the user.
Connecting the world of chatbots to virtual reality
In this project, Magama wanted to add a chatbot to its solutions so that end users would have an even more immersive and fluid experience. The chatbot would allow users, for example, to have their questions about the virtual space answered automatically.
Magama had identified AWS as its main cloud technology provider, and in DNX Brasil it found the ideal partner to turn its vision into reality. An additional challenge was the need to switch technologies after a product was discontinued; together with Magama, we adjusted the proposed solution to meet the new requirements.
From a technical standpoint, Magama needed to connect its virtual solution to a chatbot, as well as to other channels such as messaging platforms. This required an integration layer capable of connecting multiple systems to the chatbots. Beyond the chatbot connection itself, analytics and quality-control metrics for chatbot conversations would also be implemented.
The solutions: an API and a dashboard
Our solution was split into two parts. The first requirement was to integrate applications with any Amazon Lex chatbot (in our case, Lex V2). To do this, we built a serverless API that brokers this communication. With Amazon's technology, the integration supports both text and the user's voice, and can return synthesized speech from the chatbot to enable more natural use cases. Amazon API Gateway and AWS Lambda were the main services used, alongside Amazon Lex itself.
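As an illustration, the text path of such an integration might look like the sketch below: an API Gateway proxy event is shaped into a Lex V2 RecognizeText call. The bot IDs and request body format are hypothetical placeholders; only the `lexv2-runtime` client name and the `recognize_text` parameters follow the AWS SDK.

```python
import json

# Hypothetical IDs for illustration only; real values come from the Lex V2 console.
BOT_ID = "EXAMPLEBOT1"
BOT_ALIAS_ID = "TSTALIASID"
LOCALE_ID = "pt_BR"


def build_recognize_text_request(session_id: str, text: str) -> dict:
    """Build the keyword arguments for the Lex V2 RecognizeText call."""
    return {
        "botId": BOT_ID,
        "botAliasId": BOT_ALIAS_ID,
        "localeId": LOCALE_ID,
        "sessionId": session_id,
        "text": text,
    }


def lambda_handler(event, context):
    """API Gateway proxy handler that forwards a user message to Lex V2.

    The boto3 call is commented out so the module stays importable
    without AWS credentials; a deployed Lambda would uncomment it.
    """
    body = json.loads(event["body"])
    request = build_recognize_text_request(body["sessionId"], body["text"])
    # import boto3
    # lex = boto3.client("lexv2-runtime")
    # response = lex.recognize_text(**request)
    # messages = [m["content"] for m in response.get("messages", [])]
    return {"statusCode": 200, "body": json.dumps({"request": request})}
```

The voice path would use the analogous `recognize_utterance` operation, which accepts audio and can return synthesized speech.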
The second part of our solution was an analytics dashboard for Amazon Lex. Here we used Amazon CloudWatch Logs Insights, which consumes the native logs emitted by Amazon Lex and visualizes the results in a dashboard.
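As a sketch, a Logs Insights query over Lex conversation logs could count hits per intent, as below. The log group name and field paths are assumptions (the exact fields depend on how conversation logging is configured); the argument names mirror the CloudWatch Logs `start_query` API.

```python
# Hypothetical Logs Insights query over Lex V2 conversation logs.
INTENT_QUERY = """
fields @timestamp, sessionId, interpretations.0.intent.name as intent
| filter ispresent(intent)
| stats count(*) as hits by intent
| sort hits desc
""".strip()


def start_insights_query(log_group: str, start: int, end: int) -> dict:
    """Build the arguments for CloudWatch Logs `start_query`.

    A real caller would pass this dict to
    boto3.client("logs").start_query(**args) and poll `get_query_results`.
    """
    return {
        "logGroupName": log_group,
        "startTime": start,
        "endTime": end,
        "queryString": INTENT_QUERY,
    }
```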
The entire solution and its infrastructure were written as code (IaC) so they can be easily replicated, modified, and controlled. This met Magama's need to spin up multiple dashboards for its variety of clients.
Interaction inside and outside virtual reality
The delivered solution is agnostic: it is parameterizable enough to integrate any Amazon Lex chatbot and visualize the desired metrics. This supports Magama's goal of delivering chatbot-driven innovation across environments, inside and outside virtual reality, while capturing relevant data for visualization in the dashboard.
Another benefit is that the API can be exposed directly to Magama's customers, while Magama retains control over API usage, which matters for tracking cost per user or per application.
Last but not least, even with the challenge of adjustments to scope and ideation, Magama was well served by a solution that allows it to grow and become more scalable.
About DNX Brasil
DNX Brasil delivers the best cloud computing experience to its clients. Our solutions are built on the AWS cloud and include AWS Well-Architected, ECS containers, Kubernetes, continuous integration/continuous delivery, service mesh, big data, analytics, and artificial intelligence.
Our team of specialists is made up of experienced, qualified, AWS-certified professionals focused on cloud-native concepts.
Check out our open-source projects here and follow us on LinkedIn.
Effective leadership depends on using data to make important decisions. A broad view backed by accurate information is what turns insight into meaningful action, and that is what a modern data strategy is built for: delivering insights to the people and applications that need them, securely and at any scale. DNX Brasil helps your company apply data analytics to its most business-critical use cases, with end-to-end solutions backed by data expertise. Discover the value of your data.
Galax Pay: Cloud migration secures major investment for the company
About Galax Pay
Galax Pay is an automated platform for managing credit card, boleto, and Pix payments. As a Brazilian fintech, Galax Pay integrates with credit card operators to streamline recurring billing. The platform also offers full sales-data reports, a payment gateway for one-off invoices, customizable reports, automated management, and other tools that simplify billing operations.
The company understood that one of the biggest challenges faced by Brazilian business owners is poor financial predictability, which holds back investment and improvements. The Galax Pay payment system was created to end this problem, giving companies certainty that their monthly payments will be received.
In 2015, defaults were growing at an alarming rate amid an economic crisis in the country. Márcio Vinícius, Galax Pay's current CEO, realized that companies' billing and collection processes urgently needed improvement. Galax Pay emerged at a moment when no company offered automatic credit card payment services at a cost customers could afford.
About the system
Galax Pay's main goal is to simplify payment management through automation and ease the collection of one-off and recurring payments. Acting as an intermediary between banks, companies, and customers, the Galax Pay platform lets payments be made and received through multiple methods, including authorized direct debit and Pix, the free instant electronic payment platform run by the Central Bank of Brazil.
Galax Pay simplifies companies' communication with their end customers and gives them full control over every payment through reporting. Today, Galax Pay processes more than R$45 million per month and serves more than 2,700 clients.
The Company's Challenge
Galax Pay's early growth was slow due to the constraints of its on-premise infrastructure. The daily problems the infrastructure presented consumed almost all of the team's focus, leaving little time to develop the product itself.
The Galax Pay team had 27 people, and at least 10 of them were directly involved in release processes, environment monitoring, and creating test and validation environments. Other departments ran on very lean teams, which made growth difficult: with an on-premise setup, every developer hired means more infrastructure to accommodate them.
The absence of automated deployments (CI/CD pipelines) and of deployment strategies meant that releasing new application versions frequently left the application unavailable. The repository was being misused: GitLab development-branch concepts were being applied incorrectly. Without containers, each application required per-developer machine configuration, which caused availability problems in the final environment. This entangled development environments with test environments, creating a heavy demand for test environments and a large number of merges before a version could reach production.
A manually built package was placed on the server, with no integration (CI) or delivery (CD) pipeline and no defined deployment strategy, such as blue-green deployment. The same version was then released to all clients at once.
Most releases caused service interruptions for end customers, which can be very costly for a fintech's reputation, eroding the perception of efficiency and reliability. The use of GitLab repositories and the non-prod environment strategy also needed to be revisited so the company could manage quality assurance through test environments and speed up releases through automation.
The fintech also needed to comply with PCI DSS rules in the payments sector, attesting its commitment to the Payment Card Industry Data Security Standard. While having a secure environment is the first step toward meeting the industry's security standards, what really counts is the ability to remain continuously compliant.
It was in this context that Galax Pay approached DNX for help migrating its on-prem infrastructure to the cloud, enabling the growth the company was aiming for. Through this transformation, DNX directly influenced Galax Pay's ability to attract investors and scale its commercial growth, ultimately resulting in an investment from CelCoin.
The Process
- Assessment Phase
Through executive briefings, DNX understood and catalogued Galax Pay's existing infrastructure. This step demands real skill and is a critical part of the migration journey, but it allowed the DNX team not only to understand the dependencies and common problems in the environment, but also to estimate a Total Cost of Ownership (TCO), broadening Galax Pay's view of its own business. With this phase complete, DNX identified the resources and applications needed for the migration.
DNX also identified redundancies and under-used resources, including databases replicated across several servers and machines bought for specific peak dates, such as Black Friday, that sat idle for the rest of the year. Surfacing these extra costs helped Galax Pay make decisions that increased its opportunities for cost reduction and scale.
The main outcome of the assessment phase was a high-level business case outlining several strategies for the team to reach the project's goals. The analysis allowed Galax Pay to weigh all available options against its priorities and needs, ultimately leading to more solid decisions for the project.
Based on the assessment of customer-facing processes, the best solution found was to migrate the applications to containers. Containers provide a standard way to package an application's configuration, code, and dependencies into a single object, sharing only the operating system installed on the server. They let the team deploy quickly, reliably, and consistently, regardless of the environment.
As an evolution of virtualization, containers can scale an application quickly because they need very little startup time. This simplifies deployment automation, since the application is packaged once and can be promoted across environments such as development, staging, and production.
DNX concluded this was the best way to keep pace with the application's development: once containerized, everything the application needs to run is guaranteed to travel with it. The overarching strategy was to ensure maximum availability for the end user.
- Mobilization Phase
After the assessment came planning: the moment DNX began designing the new architecture and the migration plan around Galax Pay's needs. DNX assessed the cloud response-time gaps and inter-application dependencies discovered in the previous phase, and evaluated every possible migration strategy to ensure the most suitable one was selected and reflected in the business case. During Mobilization, the DNX team deployed Citadel, a cloud infrastructure architected to AWS Well-Architected standards and ready to comply with international regulatory frameworks such as PCI DSS, HIPAA, ISO 27001, and CDR, and then worked with the client to design the application platform.
The solution presented to Galax Pay was to perform the migration by modernizing the application and adopting containers on Amazon ECS, running on Fargate. ECS allows scaling on metrics such as CPU, memory, and connection count, which drive auto scaling. Fargate was chosen to achieve the elasticity and agility the Galax Pay application needed, since it lets multiple containers run simultaneously without the need to manage servers or clusters of EC2 instances.
Fargate simplifies Galax Pay's operations by removing the need to choose a server type and the time spent sizing and packing clusters. Another reason Fargate was the perfect fit was that it meets the PCI compliance requirements of the environment: with Fargate, Galax Pay does not need to continuously patch operating systems or run anti-virus software to keep machines secure.
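A CPU-based target-tracking policy of the kind described could be sketched as follows. The cluster and service names are hypothetical; the dictionary mirrors the arguments of the Application Auto Scaling `put_scaling_policy` API, which a real caller would invoke via boto3.

```python
# Illustrative target-tracking scaling policy for an ECS service on Fargate.
def build_cpu_scaling_policy(cluster: str, service: str, target_cpu: float) -> dict:
    """Build arguments for application-autoscaling's put_scaling_policy.

    A real caller would pass this dict to
    boto3.client("application-autoscaling").put_scaling_policy(**policy).
    """
    return {
        "PolicyName": f"{service}-cpu-target-tracking",
        "ServiceNamespace": "ecs",
        "ResourceId": f"service/{cluster}/{service}",
        "ScalableDimension": "ecs:service:DesiredCount",
        "PolicyType": "TargetTrackingScaling",
        "TargetTrackingScalingPolicyConfiguration": {
            "TargetValue": target_cpu,  # keep average CPU near this percentage
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
            },
            "ScaleInCooldown": 300,  # scale in conservatively
            "ScaleOutCooldown": 60,  # scale out quickly under load
        },
    }
```

Analogous policies can track memory utilization or request count per target, matching the metrics mentioned above.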
Before starting the third and final phase of the project, DNX finished configuring the landing zone on Citadel's secure foundation, setting the stage for the migration of several pilot applications.
- Migration Phase
Once the pilot applications proved successful, the migration of the rest of Galax Pay's data to the secure AWS environment began. So that Galax Pay could take full advantage of everything AWS has to offer, the DNX team modernized along the way: by modernizing data and applications with cloud-native concepts, Galax Pay set itself up for a future of success, with the efficiency of its operations optimized.
By replicating the database, DNX ensured active data synchronization, allowing data to be mirrored in the operational environment and reducing cutover downtime. Going beyond a simple lift-and-shift strategy meant Galax Pay avoided carrying the problems of its past into the company's future.
Galax Pay had approached DNX Solutions Brasil looking for an on-prem-to-cloud migration, but the final delivery exceeded expectations. The client wanted a lift-and-shift migration to AWS; we delivered a full modernization aligned with AWS quality standards. Galax Pay knew about this kind of solution but imagined it as something for the future. Instead, we implemented it right away, saving Galax Pay from having to take on a new project later.
With the results achieved, Galax Pay:
- Increased the perceived availability and performance of the application
- Reduced the turnaround time for improvements and bug fixes and their effective release, reflected in a higher score on the online review platform Reclame Aqui
- Offers greater security to customers by meeting PCI DSS standards
Application modernization was delivered as part of the migration project, increasing agility and security and allowing Galax Pay to hit goals projected for years into the future.
Increased Investment and Growth
From 2020 to 2022, Galax Pay grew fiscal-year revenue by 420%. Meanwhile, its client base grew by roughly 150%, from 1,116 to 2,784 clients.
With the operational challenges of a dated infrastructure resolved by DNX's migration, business and marketing strategies could take center stage. The results attracted CelCoin's investment, which acted as a financial catalyst for the business. The secure, scalable foundation delivered by DNX Brasil ensured Galax Pay was prepared to handle sudden spikes in traffic.
It is estimated that, had it kept its on-prem infrastructure, Galax Pay would have taken five years to reach the client growth it achieved.
Increased Deliveries
As a fintech whose digital solution is powered by a digital application channel, technology is the core of the business. The DNX team implemented deployment automation and shared knowledge with Galax Pay about GitLab and non-production environments. This enables new versions of the application to be delivered daily.
Peace of Mind
Galax Pay now operates on a secure cloud foundation, Citadel, which provides operational and compliance peace of mind through greater resilience, reliability, and security.
More Development
Replacing manual updates with automation optimized the team's time. With infrastructure concerns resolved, Galax Pay's development team now has time to focus on the company's core goals and build new features for the product.
Automation also let Galax Pay ship new features at a pace that matches its customers' expectations. Quality assurance improved as well, with separate test and production environments allowing new features to be tested before being released to the end user.
Before DNX's involvement, Galax Pay was limited to releasing new features manually, and only on weekends. Now the team has the flexibility to release new features three to four times a day.
PCI Compliance
The environment built on the Citadel solution lets the Galax Pay platform reach PCI compliance quickly, because the environment is PCI-compliant by construction. Galax Pay also used DNX Managed Services, a service offered by DNX, to gather evidence for an external audit firm, which confirmed its compliance. This secured the company's PCI certification.
Ongoing Use of Managed Services
Recognizing the efficiency of DNX's work throughout the project, Galax Pay chose to keep using DNX Managed Services, which has been adding value to the company for more than a year.
Today, DNX provides an SRE extension service to Galax Pay, acting as its expert AWS and DevOps partner. With a trusted partnership in place, Galax Pay does not need to go to the job market in search of specialized talent. That benefits Galax Pay's end customers, since the team can stay focused on what makes the application run better: fixing bugs, implementing improvements, and adding new features that make life easier for the people and companies that rely on Galax Pay's service.
Check out our open-source projects at github.com/DNXLabs and follow us on LinkedIn, Twitter, and YouTube.
Cromai: Deep Learning training 15x faster in the cloud
About Cromai
Cromai is an agtech founded in 2017 focused on efficiently improving the lives of agricultural producers. To do so, it applies frontier technologies, above all machine learning with computer vision, to automatically identify patterns in images collected in the field, delivering diagnostics that enable more precise decision-making.
Attuned to the complexity of the field, Cromai lets growers reach their full productive potential using AI in a simple, sustainable way. Its solutions can, for example, use sensors to filter vegetable impurities in sugarcane. For weeds, it can pinpoint where they sprout and guide the farmer to the best way to carry out the necessary management.
These systems process and analyze factors that generate results for producers across Brazil. This earned international attention: StartUs Insights selected Cromai as one of the five most promising startups in the world for computer vision in agriculture.
Challenges of one of the world's most promising startups
The main challenge was to shorten machine learning training time: generating each new model version took so long that it directly affected the core of the business. We moved the machine learning training to the AWS cloud, making it possible to train several new image-based models.
To get a sense of the data volume behind the weed solution, the dataset held more than 20 million images, which raised the need for a more robust training cluster. Cromai used a server with a single GPU for training its deep learning models, and with that setup experiments were slow: around 3 months to train one model.
The benefits of training neural networks on multiple GPUs in parallel
Understanding Cromai's needs, our solution aimed to reduce training time without significantly affecting its cost or the model's performance metrics. Knowing what Amazon SageMaker makes possible, we were confident we could deliver a good result.
From the start, we had two big advantages that contributed to the project's success. First, AWS offers very powerful training instances, equipped with several modern GPUs per instance. That change alone pays off in raw performance.
Second, training can be distributed across more than one instance. This is not a trivial task, since distributed neural network training must stay synchronized across its instances and GPUs. Frameworks exist for this task, such as SageMaker distributed.
In our project, a technical requirement led us to choose Horovod, an open-source distributed training framework for deep learning algorithms.

Amazon SageMaker supports this framework, and our main task was adapting Cromai's training script to the SageMaker environment. We used S3 to store the training data and, most importantly, added the Horovod layer to the training script.
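A minimal sketch of what such a Horovod layer adds to a training script, assuming TensorFlow/Keras and the usual Horovod conventions; `build_model` and `load_dataset` are placeholders standing in for the actual code, and the Horovod imports are kept inside `main()` so the learning-rate helper can be exercised on its own.

```python
def scale_learning_rate(base_lr: float, num_workers: int) -> float:
    """Horovod's usual convention: scale the learning rate linearly with
    the number of workers, since the effective batch size grows with them."""
    return base_lr * num_workers


def main():
    import tensorflow as tf
    import horovod.tensorflow.keras as hvd

    hvd.init()  # one process per GPU
    # Pin each process to its own GPU.
    gpus = tf.config.list_physical_devices("GPU")
    if gpus:
        tf.config.set_visible_devices(gpus[hvd.local_rank()], "GPU")

    model = build_model()  # placeholder for the actual network
    opt = tf.keras.optimizers.SGD(scale_learning_rate(0.01, hvd.size()))
    opt = hvd.DistributedOptimizer(opt)  # averages gradients across workers

    model.compile(optimizer=opt, loss="categorical_crossentropy")
    # Broadcast initial weights from rank 0 so all workers start in sync.
    callbacks = [hvd.callbacks.BroadcastGlobalVariablesCallback(0)]
    model.fit(load_dataset(), epochs=10, callbacks=callbacks)
```

In a SageMaker training job, this script runs once per GPU across the chosen instances, with the dataset streamed from S3.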
We also created an easy, cost-transparent way for Cromai to choose the number and type of instances for each training run.
Reduced training time and the business impact
Cutting training time was fundamental to scaling Cromai's projects; slow model training was directly affecting the success of the business.
Thanks to our team's command of what Amazon SageMaker can do and the strategy we devised, we resolved this pain point effectively.
The solution dramatically reduced training time, from 3 months to 6 days, while preserving all existing performance metrics. When needed, Cromai also has the option of increasing its training spend to get results in as little as 3 days.
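The "15x" in the headline follows directly from these figures, assuming months of roughly 30 days:

```python
# Back-of-the-envelope check of the headline speedup.
baseline_days = 3 * 30   # ~3 months on the single-GPU server
cloud_days = 6           # distributed training on SageMaker
speedup = baseline_days / cloud_days
print(f"{speedup:.0f}x faster")  # 15x faster
```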
With shorter training cycles, iteration became more frequent, which increased agility, and Cromai's technology team now spends more time doing what it loves: making its solutions better and more suited to the reality of rural producers.
About DNX
At DNX Brasil we work to bring the best cloud and application experience to digital-native companies in Brazil.
We focus on AWS, Well-Architected solutions, containers, ECS, Kubernetes, continuous integration and delivery, service mesh, and data solutions (data platforms, data lakes, machine learning, analytics, and BI).
Check out our open-source projects at github.com/DNXLabs and follow us on LinkedIn, Twitter, and YouTube.
Written by: Ladislav Vrbsky and Luis Campos / Review: Camila Targino
Learn what the AWS Well-Architected practices really are

Cloud computing has been revolutionizing the world for some time now. With the solutions it has enabled, many areas of everyday life are being transformed.
When it comes to cloud services, there are countless possible uses, varying with the interests of each company or startup.
In this space it is common to come across terms such as PaaS (Platform as a Service), SaaS (Software as a Service), and IaaS (Infrastructure as a Service), concepts well understood mainly by those who work in the field.
So, have you heard of AWS (Amazon Web Services) and the products and services it offers, such as Well-Architected? If not, read on and let's fix that right now!
One of the largest and best cloud services on the planet
AWS is known worldwide as the largest cloud computing company, offering more than 165 products and services.
Among the services AWS offers, highlights include storage, databases, compute, servers, and machine learning. On the infrastructure side (IaaS), Amazon S3, Amazon EC2, and Lambda stand out.
On the platform side (PaaS) there are also Elastic Beanstalk and DynamoDB, along with the many software offerings (SaaS) available for purchase on AWS itself.
This range of products and services grows every year, giving customers plenty of solution options. Sometimes, though, the environment is not used in the most appropriate way, and that can hurt security, performance, cost, infrastructure, customer service, and more.
So, what exactly is Well-Architected?
To help its customers make the best possible use of all the services it offers, AWS defined six areas as the pillars of Well-Architected: Operational Excellence, Security, Reliability, Cost Optimization, Performance Efficiency, and Sustainability.
Below, we present the six pillars that form the foundation of Well-Architected:
Operational Excellence
This key pillar focuses on running and monitoring systems and on continuously improving processes and procedures.
Key topics include automating changes, responding to events, and defining standards to manage daily operations.
Security
Focuses on protecting information and systems. Key topics include data confidentiality and integrity, managing user permissions, and establishing controls to detect security events.
Reliability
Concerns workloads performing their intended functions and recovering quickly from failures to meet demand.
Key topics for this pillar include distributed system design, recovery planning, and adapting to changing requirements.
Performance Efficiency
Focuses on the structured, streamlined allocation of IT and computing resources. Key topics include selecting resource types and sizes optimized for workload requirements, monitoring performance, and maintaining efficiency as business needs evolve.
Cost Optimization
Focuses on avoiding unnecessary costs. Key topics include understanding spending over time and controlling fund allocation, selecting the right type and quantity of resources, and scaling to meet business needs without overspending.
Sustainability
This pillar focuses on minimizing the environmental impact of running cloud workloads.
Key topics include a shared responsibility model for sustainability, understanding impact, and maximizing utilization to minimize required resources and reduce downstream impact.
AWS trusts these pillars, and takes this architecture so seriously, that it offers a US$5,000 credit to customers who update their environment with these six areas in mind. To access the credit, however, an assessment and re-assessment must be carried out with an AWS partner accredited for this procedure. DNX Brasil is accredited by AWS to provide this service.
To make it easier for companies to achieve this optimized environment, as well as the credit provided by AWS, DNX created a product called DNX One Foundation, which you can get to know better by following our upcoming posts!
Enjoyed this content? Follow our publications to stay on top of everything happening in the cloud. If you have questions, get in touch with DNX Brasil: we're here to help!
Talk to a DNX Brasil specialist and book a free 15-minute meeting to explore your cloud migration possibilities.
AWS, Azure, or GCP: Which cloud provider is right for you?

The Big Three
In modern day cloud computing, three major providers hold the top spots: Google Cloud Platform (GCP), Microsoft Azure, and Amazon Web Services (AWS).
While all three platforms look similar on the surface, with features such as self-service, autoscaling, and high-level security and compliance, the difference is in the details. These providers vary in their computing capabilities, storage technologies, pricing structures, and more.
When migrating to the cloud, the key to success is choosing a provider that matches your unique business goals. In this article, we outline the major differences and provide guidance on how to choose the right cloud provider for you.
Computing Power
GCP is less functionally rich than Azure and AWS, though it offers unique advantages including managing and deploying cloud applications, payable only when code is deployed.
Azure uses a network of virtual machines to offer a full variety of computing services, including app deployment, extensions and more.
AWS computing, termed EC2 (Elastic Compute Cloud), is highly flexible, powerful, and less costly than other services. EC2 provides auto scaling to match your usage, so you don’t pay more than necessary. AWS offers a sophisticated range of computing features including speed, optimal security, security group management, and much more.
Storage Technologies
Whilst GCP’s storage options are reliable, they remain fairly basic, with features including cloud storage and persistent disk storage.
Azure offers many storage cloud types to target various organisational needs, including Data Lake Storage, Queue Storage and Blob Storage. Additionally, File Storage is optimised for most business requirements.
AWS offers a wide range of storage solutions that allow for a high level of versatility. Simple Storage Service is industry standard, while Storage Gateway offers a more comprehensive storage approach.
Network & Location
GCP does not match the reach of Azure or AWS, currently serving 21 regions with aims to grow its number of data centres around the world.
Azure is available in 54 regions worldwide, keeping traffic within the Azure network for a secure networking solution.
AWS runs on a comprehensive global framework spanning around 22 regions, 14 data centres, and 114 edge locations. This ensures continuous service, reliable performance, speedy cloud deployment, and lightning-fast response times.
Pricing Structure
GCP offers multiple pricing options, from free tier to long-term reservations. These prices are affected by many factors including network, storage and serverless pricing.
Azure charges on a per-second basis, allowing users to start and stop the service, paying only for what they use.
AWS provides a convenient pay-as-you-go model, allowing users to pay only for what they consume, without any termination fees.
Conclusion
AWS is the superior cloud provider in the market, reducing time to value for customers and increasing business agility. With significantly more services than the other providers, AWS offers a greater range of features to its users. For these reasons, among others, DNX Solutions works exclusively with AWS, helping our clients take full advantage of all the benefits it provides. Each of our solutions is designed with AWS in mind, allowing us to focus on getting the most out of the cloud for our clients, today and in the future.
How can DNX help you?
Contact us now to learn more about making the most of the many AWS benefits.
As an AWS partner, DNX offers the guidance and expertise for cloud migrations done right. We offer seamless migration to AWS, following best practice architectural solutions while offering modernisation as a part of the process. With a professional team by your side ensuring security, compliance and best practices, your business will get the most out of this powerful cloud provider.
Contact a DNX expert to book a free 15-minute consultation and explore your possibilities for Cloud Migration
The basics of Cloud Migration

What is Cloud Migration all about?
The concept of cloud migration is familiar to those who use cloud storage in their personal lives. Simply put, cloud migration is the process of moving information from an on-premise source to a cloud computing environment. You can think of it as moving all your important data and programs from your personal computer to a place where they are automatically backed up and protected. If your computer were to experience a power failure, have hot coffee spilled over it, or be stolen, you would be able to access all of your data from another computer, and have the ability to update your security functions if a breach had occurred. With greater movement of employees and company expansions, storing data in the cloud facilitates business innovation and security, leading to efficiency and ease of governance, preparing you for the digital future.
On a larger scale, cloud migration for businesses includes the migration of data, applications, information, and other business elements. It may involve moving from a local data centre to the cloud, or from one cloud platform to another.
The key benefit is that, through cloud migration, your business can host applications and data in the most effective IT environment possible with flexible infrastructure and the ability to scale. This enhances the cost savings, performance and security of your business over the long term.
Cloud migration is a transformation that will lead the way forward in years to come.
What are the benefits of migrating to the cloud?
The cloud brings agility and flexibility to your business environment. As we move into the world of digital workspaces, cloud migration allows for enhanced innovation opportunities, alongside faster time to delivery.
Businesses will realise all kinds of benefits, including reduced operating costs, simplified IT, improved scalability, and upgraded performance. Meeting compliance for data privacy laws becomes easier, and automation and AI begin to improve the speed and efficiency of your operations. Cloud migration results in optimisation for nearly every part of your business.
What are the options for Cloud Migration?
There are six main methods used to migrate apps and databases to the cloud.
- Rehosting (“Lift-and-shift”). Through this method, the application is moved to the cloud without any changes made to optimise the application for the new environment. This allows for a fast migration, and businesses may choose to optimise later.
- Replatforming (“Lift-tinker-and-shift”). This involves making a few cloud optimisations during migration, without changing the application’s core architecture.
- Re-purchasing. This involves purchasing a new product, either by transferring your software licence to an online server or replacing it entirely using SaaS options.
- Re-architecting/Refactoring. This method involves developing the application using cloud-native features. Although initially more complex, this future-focussed method provides the most opportunity for optimisation.
- Retiring. Applications that are no longer required are retired, achieving cost savings and operational efficiencies.
- Retaining. This is a choice to leave certain applications as they are with the potential to revisit them in the future and decide whether they are worth migrating.
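As a rough illustration, the choice between these methods can be expressed as a simple decision rule. The sketch below uses simplified, made-up criteria and names; it is not an official AWS decision tree:

```typescript
// Illustrative decision helper for the six migration methods described above.
// The criteria are deliberately simplified assumptions for this example.
interface AppProfile {
  stillNeeded: boolean;            // if false: Retire
  mustStayOnPrem: boolean;         // if true: Retain
  saasReplacementExists: boolean;  // if true: Repurchase
  needsCloudNativeRework: boolean; // if true: Refactor
  minorTweaksOnly: boolean;        // if true: Replatform; otherwise Rehost
}

function migrationStrategy(app: AppProfile): string {
  if (!app.stillNeeded) return "Retire";
  if (app.mustStayOnPrem) return "Retain";
  if (app.saasReplacementExists) return "Repurchase";
  if (app.needsCloudNativeRework) return "Refactor";
  if (app.minorTweaksOnly) return "Replatform";
  return "Rehost"; // fast lift-and-shift, optimise later
}

console.log(migrationStrategy({
  stillNeeded: true,
  mustStayOnPrem: false,
  saasReplacementExists: false,
  needsCloudNativeRework: false,
  minorTweaksOnly: false,
})); // Rehost
```

In practice the assessment is far richer, weighing interdependencies, licensing, and cost, but the ordering above captures the usual elimination logic.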
How much does it cost?
Migrating to the cloud requires a comprehensive strategy, taking into account multiple management, technology and resource challenges. This means the cost of migration can vary widely, particularly as goals and requirements differ between organisations. Funding options may be available to your business when migrating to AWS, so considering all your options carefully may factor such opportunities into your decision and have an impact on which methodology you choose to follow.
In recent years, technologies and cloud computing companies have emerged to bring ease and efficiency to the migration process, such as cloud migration powerhouse DNX Solutions.
How does DNX help you with Cloud Migration?
DNX identifies your unique business needs to uncover the best pathway for you, making your migration journey simpler, faster, and more cost-effective. With a secure, speedy cloud migration process, DNX sets your business up for success from day one.
Using DNX for Cloud Migration means you migrate the right way — and unlock full value from AWS — through a unique, secure, and automated foundation.
DNX makes it easy to migrate to a Well-Architected, compliant AWS environment. As part of the process, DNX modernises your applications so you can leverage the benefits of cloud-native technologies. This means your business will enjoy more resilience, cost efficiency, scalability, security, and availability from the very beginning.
DNX has the solutions and experience you need. Contact us today for a blueprint of your journey towards data engineering.
Bringing cloud native concepts through DevAx to accelerate cloud journey for Big Red Group
DNX Solutions delivered the AWS Developer Acceleration (DevAx) enablement program to Big Red Group (BRG). The program is aimed at increasing the customers’ developer skills for cloud adoption and building developer cloud native fluency across their organisation. A major focus of AWS DevAx is the developer patterns and practices of modernisation and distributed system design, to break down and rearchitect monolithic application architectures.
The DNX team delivered the AWS DevAx enablement as a structured program, working directly with BRG’s development teams for six weeks. A comprehensive curriculum taught through workshops and co-development sessions resulted in the upskilling of BRG’s internal development community.
What is the “Monoliths To Microservices” Program?
The migration from a monolithic architecture to microservices requires willingness on the part of both the developers and the business as a whole, as well as a thorough understanding of how architectures such as microservices design patterns can be used and of the tools available to deploy them.
The AWS DevAx “Monoliths to Microservices” program aims to increase developers’ knowledge and experience in distributed system design patterns, or to assist developers in gaining more experience in developing on AWS in general. The program takes a theory and patterns-first approach, then introduces the AWS developer tools. It, therefore, targets experienced developers looking to increase their skills, which perfectly reflects the BRG team that undertook the program with DNX Solutions.
Over the 6 weeks that DNX delivered the program, BRG developers started with a Java Spring Boot monolith backed by a large RDBMS and methodically broke it into a series of decoupled microservices. The DNX team rehosted the application in AWS, then refactored the application architecture to introduce application release automation, build bounded-context-based microservices, refactor and rearchitect the databases, implement an event-driven system, add authentication and authorisation, and create AI-driven services.
Microservices security best practices are covered as a cross-cutting concern across all modules:
- Module 1: Lift & Shift – Migrating The Monolith
- Module 2: Application Release Automation
- Module 3: Create a Microservice
- Module 4: Refactor Your Database
- Module 5: Microservices Decoupled Eventing & Messaging Architectures
- Module 6: Creating an Authenticated Single Page App
- Module 7: Creating Immersive AI Experiences
What is the value of the AWS DevAx program to BRG?
The DevAx enablement contributed to a mindset shift in the BRG Java developers, who received the knowledge and tools required to alter their way of working from monolithic applications to a microservices-based architecture. This gave them the chance to understand the new technology, the different opportunities it provides, and why it is worth adopting. For a company dealing with multiple brands, each with unique infrastructure and functionality, merging the data was a mammoth task that required an open-minded and educated developer team. As stated by the BRG Head of Engineering, this complexity is the reason “Devax Academy was extremely important in changing our team’s mindset, encouraging them to get involved with the project”. In addition, the deep understanding of the patterns BRG’s teams need to break apart the monolith across different types of architectures at speed will allow developers to reuse those same patterns in the future.
To move from monolith to microservices was a breakthrough for BRG. By moving away from long-running environments and drastically altering the development life cycle, teams can drive development straight from the code repository, with developers able to spin up environments on demand. In addition, non-production costs are massively decreased, since production is maintained while non-production environments change only as development requires. In BRG’s case, the new confidence in breaking up and re-architecting monolithic applications that cannot be easily rehosted in the cloud has opened many more doors, such as making it possible for them to build a secure Infrastructure as a Service (IaaS) offering that is simple to use and maintain. An additional benefit of microservices is the ability to implement Straight-Through Processing (STP). STP uses automation to increase the speed of financial transactions, which not only simplifies financial processes but, as implemented at BRG, has also saved them a huge amount in operational expenditure.
Upon completion of the program, the BRG team had gained a thorough foundation of knowledge and insight, meaning they are not only willing but also able, to strive for continual improvement. These benefits are just some of those gained by BRG due to the move from monolith to microservice technology, all of which can be achieved by any business willing to commit to the change.
DNX Solutions values sharing knowledge and is proud to be able to deliver comprehensive programs through the AWS DevAx enablement. For businesses that want to take control of their assets without having to rely on external resources, completing enablement through DevAx is a straightforward and valuable way to increase in-house skills. To see how your business can benefit from this program, contact DNX today.
Big Red Group’s challenge to create a new infrastructure for multiple unique brands
Big Red Group (BRG) is the leading experience partner in Australia and New Zealand.
BRG is the parent company of major experience brands, such as RedBalloon, Adrenaline, Lime&Tonic, and Experience OZ. Each one of them has its own unique value proposition to attract and engage diverse audiences, with exclusive distribution channels and B2C and B2B offerings, unlocking access to more than 10,000 experiences across Australia and New Zealand.
The Challenge
After acquiring new brands and inheriting their technology and infrastructure, BRG had to maintain multiple infrastructure sets resulting in the challenge of creating and maintaining new functionalities for each brand. In addition, they had the challenge of providing meaningful reports for the business due to their different data models.
BRG were seeking a cloud consultant partner that could assist them in building a secure infrastructure as a service that was simple to use and maintain from day one. They also sought to increasingly leverage microservices to ensure continuous, agile delivery and flexible deployment of complex, service-oriented applications.
DNX Solutions determined BRG’s business and technical capabilities, such as the interdependencies, storage constraints, release process, and level of security. With the required information at hand and BRG’s required technology, DNX developed a roadmap to meet BRG’s Technical and Business objectives, using AWS best practices “The 7R’s” (retire, retain, relocate, rehost, repurchase, replatform, and refactor).
The Solution
BRG’s project was implemented in two phases, delivering an AWS Foundation, an Application Platform (containers), and Application Blueprints (static frontend and containers with a full CI/CD pipeline).
DNX Well-Architected Foundation entails:
- AWS Landing Zones
- 100% infra-as-code
- CI/CD for infrastructure
- CDK in Typescript
- Knowledge transfer
- Cost Report and optimization
- AWS ClientVPN Auditing Strategy
AWS Application Platform
- AWS ECS
- CloudFront + S3 (Static Application)
- Application CI/CD Strategy
- Monitoring strategy
- Auto-scaling strategy
- Logging strategy and retention
- Secrets management
- Application BluePrints
The Outcome
The DNX team designed and implemented secure infrastructure as code for BRG’s entire foundation, using the AWS Cloud Development Kit (CDK) in TypeScript and deploying through AWS CloudFormation, as per BRG’s prerequisites.
TypeScript was chosen by BRG’s team to give them an easier way to write and maintain not just the application codebase but also the infrastructure. TypeScript is a superset of JavaScript that primarily provides optional static typing, classes, and interfaces. One of its big benefits is enabling IDEs to provide a richer environment for spotting common errors as you type, something BRG’s team was already very familiar with.
It offers all the features of JavaScript, plus an additional layer on top: the TypeScript type system. This helps companies build more robust code, reduce runtime type errors, take advantage of modern features before they are available in JavaScript, and work better across development teams.
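As a small illustration of that extra layer, here is a minimal TypeScript sketch; the `Experience` interface and sample values are invented for this example, not taken from BRG’s codebase:

```typescript
// Optional static typing: the compiler checks every object against this shape.
interface Experience {
  id: string;
  name: string;
  priceCents: number;
}

// The return type annotation documents intent and is verified at compile time.
function totalPriceCents(items: Experience[]): number {
  return items.reduce((sum, item) => sum + item.priceCents, 0);
}

const cart: Experience[] = [
  { id: "e1", name: "Hot air balloon flight", priceCents: 34900 },
  { id: "e2", name: "Jet boat ride", priceCents: 9900 },
];

console.log(totalPriceCents(cart)); // 44800

// A call like totalPriceCents([{ id: "e3" }]) is rejected at compile time,
// because the object is missing required properties - a bug plain JavaScript
// would only surface at runtime, if at all.
```

The same checking applies to CDK infrastructure code, where a mistyped property name fails the build instead of failing a deployment.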
DNX also deployed Application Blueprints (static frontend and containers with a full CI/CD pipeline) so BRG’s team could deploy, migrate, manage, and monitor their own applications in the AWS cloud in the future.
As with all of our projects, DNX delivered extensive documentation and sessions on transferring knowledge covering how DNX Foundations works, how to deploy applications, how to run CI/CD pipelines, and more.
Moreover, DNX delivered the six-week AWS DevAx Academy training program Monoliths to Microservices for BRG’s Java developers.
Conclusion
No matter your needs or requirements, DNX is able to deliver the right solution for your business.
How DevOps is contributing to CreditorWatch’s Digital Transformation
We live in a Digitally Transformed world where technology allows new forms of work in a rapidly changing environment. Traditional businesses are challenged by start-ups and tech companies with innovative and disrupting business models. New apps and services are created and become obsolete in the blink of an eye.
The traditional development, test, production, and operation models no longer serve our high-speed, connected world, but rather create bottlenecks and friction between departments. Each of the technology areas ends up becoming a silo with strict interaction rules.
On one side of the ring, we have development, trying to answer in the best and fastest way it can through the use of business insights, agile methodologies, and modern architectures and languages. In the other corner, there is IT operations, on a quest for stability and control of the production environments. IT operations is tasked with creating processes and procedures to ensure that every piece of released code is stable enough to avoid incidents, all the while continuing to protect what is already running.
And between them? A huge abyss. This distance separating Development and Operations results in clashes, increasing the time for delivery and problem resolution.
To reduce the friction and allow business ideas to become features to service consumers, the DevOps concept was forged around 2010. It is a concept that continues to grow and, in recent years, has begun changing the IT landscape.
What is DevOps?
DevOps is a work culture that brings software development closer to IT operations, allowing the business as a whole to reap the rewards of collaboration.
DevOps is not a methodology or a tool, but a set of practices built on automation, communication, and shared objectives, changing organisational cultures to bring to life a new way to deliver IT. DevOps spans the whole Design, Build, and Operate IT lifecycle, unifying these processes with governance and security as its basis, stitched together with automation and an agile way of working.
How is DNX assisting CreditorWatch to evolve and implement a DevOps culture?
All DNX projects use DevOps practices, which gives us the ability to deliver higher-quality solutions to clients, with faster and continuous delivery.
Clients are often so impressed by these results that they wish to deliver the same level of quality, knowledge, and efficiency to their own clients.
After completing a successful data modernisation project with DNX, CreditorWatch wanted to continue its digital transformation by implementing a DevOps culture in its IT operations. The DNX professional services team delivered a series of hands-on workshops where developers learned about configuration management, infrastructure as code, and the whys of the platform. This gives developers the ability to transform into a DevOps team.
DNX’s pattern and template creation considerably flattens the learning curve, allowing CreditorWatch’s developer team to recreate those patterns and operate as a platform team in their own right.
What is CreditorWatch obtaining with its Digital Transformation?
By adopting DevOps practices, CreditorWatch, represented by its CTO Joseph Vartuli, is building a culture of shared responsibility, transparency, and faster feedback as the foundation of every product and feature developed by its team. This gives them:
- Increased competitive advantage
- Decreased risks
- Decreased costs
- Continuous delivery and deployment
Continuous delivery is an ongoing DevOps practice of building, testing, and delivering improvements to software code and user environments with the help of automated tools. The key outcome of the continuous delivery (CD) paradigm is code that is always in a deployable state.
- Reduced downtime
- Reduced time to market
- Increased employee engagement and satisfaction, through the use of the latest technologies
Adopting a DevOps work culture means different teams within the business collaborate in order to reach a shared goal. Products and services are delivered to your end users at a faster rate with a higher level of quality. As technology becomes integrated with every aspect of our lives, work silos only get in the way. Just like CreditorWatch, you too can benefit from DevOps practices, transporting your business to the future.
The Unique Value DNX brought to the CreditorWatch Project
DNX Solutions utilised its knowledge on DevOps, Cloud, data, and Software Engineering to provide CreditorWatch with a secure environment that continually meets ISO and other compliance standards. The diversity of experience integrated within the DNX team allowed for instant identification of areas for improvement in CreditorWatch’s systems. In addition, DNX assisted CreditorWatch in bringing about a cultural change by transferring its DevOps mindset approach. Not only was the goal of agility and efficiency reached by the close of the project, but significant storage cost reductions were made enabling CreditorWatch to compete to a higher standard and continue to expand.
CreditorWatch Democratises Credit Data
CreditorWatch was founded in 2010 by a small business owner who wanted to create an open source, affordable way for SMBs to access and share credit risk information. Today, CreditorWatch’s subscription-based online platform enables its 55,000+ customers—from sole traders to listed enterprises—to perform credit checks and determine the risk to their businesses. It also offers additional integrated products and services that help customers make responsible, informed credit decisions.
CreditorWatch helps businesses understand who they are trading with and any creditor issues associated with that particular business. They analyse data from 30 different sources, including both private and government sources. Some of their most powerful behaviour data is crowdsourced from their very own customers providing insights into businesses. Ultimately, CreditorWatch customers get access to Australia’s most insightful business credit rating.
The Challenge of Australia’s Largest Commercial Credit Bureau
An expansion phase saw major corporations, including Australia’s Big Four banks, looking to leverage CreditorWatch’s rich dataset and granular analytics capabilities. As a result, CreditorWatch decided to increase its agility and efficiency. Needing to provide a continuously secure and compliant environment, with reduced costs and faster time to market, CreditorWatch engaged DNX Solutions. DNX was tasked with creating and executing a roadmap for the improvements, targeting cloud-native concepts and bringing more efficiency to the IT and Operations teams.
Through workshops during the discovery phase, DNX determined CreditorWatch’s business and technical capabilities, such as the interdependencies, storage constraints, release process, and level of security. With the required information at hand, DNX developed a roadmap to meet CreditorWatch’s Technical and Business objectives, using AWS best practices “The 7R’s” (retire, retain, relocate, rehost, repurchase, replatform, and refactor).
A Safe Environment to Meet ISO Standards
To continue delivering a safe platform to their customers and meeting the requirements of ISO and other compliance standards, DNX constructed a new secure AWS environment utilising its DNX.one Foundation.
Rather than undergoing a lengthy and expensive process each time a safe environment needs to be recreated, DNX.one helps customers build secure, scalable container platforms with high availability at low cost. This unique marketplace solution, designed for AWS with well-architected principles, combines years of cloud experience in a platform focused on simplicity, infrastructure as code, and open-source technologies. In addition, DNX.one provides a consistent approach to implementing designs that will scale CreditorWatch’s application needs over time.
Once CreditorWatch’s environment was secured with the best AWS and industry practices, it was time to move to the modernisation phase.
Instant Cost Reduction of 120K per Year With Data Modernisation
Due to the amount of data received on a daily basis, CreditorWatch’s database increases considerably in size and cost.
The DNX data team worked on the data engineering, optimising CreditorWatch’s Aurora database and its tooling to full capability.
Amazon Aurora is a MySQL and PostgreSQL-compatible relational database built for the cloud that combines the performance and availability of traditional enterprise databases with the simplicity and cost-effectiveness of open source databases.
Amazon Aurora features a distributed, fault-tolerant, self-healing storage system that auto-scales up to 128TB per database instance. It delivers high performance and availability with up to 15 low-latency read replicas, point-in-time recovery, continuous backup to Amazon S3, and replication across three Availability Zones.
Aurora data is stored in the cluster volume, which is a single, virtual volume that uses solid state drives (SSDs). A cluster volume consists of copies of the data across three Availability Zones in a single AWS Region. Because the data is automatically replicated across Availability Zones, customers’ data is highly durable with less possibility of data loss. This replication also ensures that databases are more available during a failover.
The Aurora cluster volume contains all user data, schema objects, and internal metadata, such as the system tables and the binary log. Its volumes automatically grow as the amount of data in the customer’s database increases.
With extensive data knowledge and years of experience with AWS solutions and tools, DNX provided a unique solution to configure the Aurora database to leverage its full capabilities, resulting in an instant cost reduction of over 90K per year related to the threshold of data kept instantly available.
The DNX team also created an automated archiving process using Apache Airflow on AWS, which analyses CreditorWatch’s database tables and identifies data that has gone unused for a period of time. Unused data is then archived in a different type of file storage at a cheaper rate than S3. This process resulted in an additional cost reduction of 30K per year.
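In outline, the selection step of an archiving job like the one described might look as follows. The table names, idle threshold, and data shapes here are assumptions for illustration; the real process runs as Airflow tasks against the database’s access metadata:

```typescript
// Hypothetical sketch: pick tables whose last access is older than a cutoff,
// so they can be moved to cheaper archive storage.
interface TableStats {
  name: string;
  lastAccessed: Date;
}

function tablesToArchive(
  tables: TableStats[],
  now: Date,
  maxIdleDays: number
): string[] {
  const cutoff = now.getTime() - maxIdleDays * 24 * 60 * 60 * 1000;
  return tables
    .filter((t) => t.lastAccessed.getTime() < cutoff)
    .map((t) => t.name);
}

// Invented sample metadata, not CreditorWatch's real tables.
const now = new Date("2022-06-01");
const stats: TableStats[] = [
  { name: "invoices_2015", lastAccessed: new Date("2021-01-10") },
  { name: "customers", lastAccessed: new Date("2022-05-30") },
];

console.log(tablesToArchive(stats, now, 180)); // [ 'invoices_2015' ]
```

A scheduler such as Airflow would run this selection on a cadence and hand the resulting table list to a task that exports and archives the data.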

Effective leadership depends on using data to make important decisions; a broad view built on accurate information is needed to take meaningful action. This is how a modern data strategy is built: delivering insights to the people and applications that need them, securely and at any scale. DNX Brasil helps your company apply data analytics to your most business-critical use cases, with complete solutions that call for data expertise. Discover the value of data.
Quicksight vs Tableau for Data Analytics. A Comprehensive Comparison
With so many tools available to improve business experiences, it can be difficult to know which will work best for your specific needs. Comparisons between the top competitors can save you significant resources before investing in tool purchases and training your team. Two well-known data analytics tools are Tableau and QuickSight, both of which offer a range of visualisations allowing you and your team to understand your data better. In a world where data is becoming more and more powerful, understanding the story your data tells is absolutely essential for future success.
Whilst all businesses are at different stages of their data modernisation journeys, those who invest in getting ahead now find themselves with a huge advantage over the competition. Data analytics has come a long way since manually manipulating data in Excel, and today a number of simplified platforms are available, meaning you don’t need a team full of data scientists in order to understand what’s going on around you. Tableau, founded in 2003, is now competing with QuickSight, rolled out in 2016. In this article we comprehensively compare these two analytics tools, so you don’t have to.
Getting Started:
Unlike Tableau, which needs a desktop installation to create data sources, QuickSight has a range of options for data connectivity. Anyone can start viewing insights in QuickSight regardless of their level of training, so it allows the whole team to understand what the data is saying. Tableau is not the easiest tool to navigate, with many business users only benefitting from it after undertaking training. If you have a diverse team with varying technical knowledge, QuickSight is the right tool for you.
Management:
Tableau has two options for servers, Tableau Online and On-Premises Tableau servers. On-prem servers require dashboards to be developed by analysts and pushed to the server. In addition, they require provision of servers and infrastructure which can be costly to maintain, upgrade and scale. The Tableau Online option has support for a limited number of data sources and is plagued with a history of performance issues. QuickSight, on the other hand, is a cloud-native SaaS application with auto-scaling abilities. Content is browser based, meaning different version usage by clients and servers is inconsequential. In addition, QuickSight’s release cycles allow customers to use new functionality as they emerge with no need to upgrade the BI platform.
Speed and Innovation:
The use of local machines and self-managed servers inhibits Tableau’s ability to perform at great speed and often requires technology upgrades. QuickSight, however, produces interactive visualisations in milliseconds thanks to its in-memory optimised engine, SPICE. With regard to innovation, despite Tableau’s quarterly release cycle, most users only upgrade annually due to the complexity and costs involved. In contrast, QuickSight users can take advantage of the constant stream of new features as soon as they are released.
Cost and Scalability:
The cost difference between the two tools is stark. Tableau has three pricing options, all of which must be paid in full regardless of monthly usage, with plans ranging from $15 to $70 per user per month. QuickSight is priced on a per-user basis and ranges from $5 to $28 per month; if a user goes a month without logging in, they pay nothing. In the most common scenario, QuickSight is 85% cheaper than Tableau.
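The effect of usage-based pricing can be sketched with a small calculation. This is an illustrative model only, using the per-user figures quoted above rather than current Tableau or AWS price lists:

```python
# Illustrative cost model: flat per-seat licensing vs usage-based pricing.
# The per-user prices are the article's examples, not official pricing.

def tableau_monthly_cost(seats, price_per_seat=70):
    """Flat licensing: every seat is paid in full regardless of usage."""
    return seats * price_per_seat

def quicksight_monthly_cost(active_users, price_per_user=28):
    """Usage-based: a user who never logs in during the month pays nothing."""
    return active_users * price_per_user

# A 100-seat team where only 60 people log in during a given month:
tableau = tableau_monthly_cost(100)        # 100 seats billed either way
quicksight = quicksight_monthly_cost(60)   # only the 60 active users billed
saving = 1 - quicksight / tableau
print(f"Tableau: ${tableau}, QuickSight: ${quicksight}, saving: {saving:.0%}")
```

Under these assumed numbers the saving is 76%; with cheaper QuickSight reader tiers it approaches the 85% figure quoted above.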
The inflexible pricing plans offered by Tableau mean deciding to scale is a difficult call to make. In addition, as the number of users and the amount of data increase, so too do the overhead costs of maintaining the BI infrastructure. QuickSight, like all AWS products, is easily scalable and doesn’t require server management. Risk is reduced when experimenting with scaling thanks to QuickSight’s usage-based pricing model.
Security:
Customers utilising Tableau have some difficult decisions to make when it comes to security. Because agents or gateways must be deployed to connect to data on-premises or in private VPCs, security levels are compromised. QuickSight allows customers to link privately to VPCs and on-premises data, protecting themselves from exposure through the public internet. With automatic back-ups in S3 for 11 9s of durability and HA/multi-AZ replication, your data is safe with QuickSight.
Memory:
Tableau’s in-memory data engine, Hyper, may be able to handle very large datasets, but it is no match for SPICE. QuickSight’s SPICE has a constantly increasing row limit, and QuickSight Q offers superior performance when it comes to integrating with Redshift and Athena to analyse large amounts of data in real time.
Sourcing and Preparing Data:
Although the frequency of data being stored on-premises is declining, some companies are yet to undertake full data modernisation and still require access to on-prem locations. Tableau handles this with access to data from sources such as HANA, Oracle, Hadoop/Hive and others. QuickSight, whilst primarily focussed on cloud-based sources, can also connect to on-premises data through AWS Direct Connect. The growing list of databases available to QuickSight includes Teradata, SQL Server, MySQL, PostgreSQL and Oracle (via whitelisting). Tableau allows users to combine multiple data sources and prepare data for analysis through complex transformations and cleansing. QuickSight can utilise other AWS tools such as Glue and EMR to guarantee quality treatment of data, and beyond those two there are multiple other ETL partners that can be engaged for data cleansing.
Dashboard Functionality and Visualisations:
Tableau has built-in support for Python and R scripting languages and offers a range of visualisation types as well as highly formatted reports and dashboards. QuickSight tends to be more popular in its visualisations, with over a dozen types of charts, plots, maps and tables available. The ease with which data points can be added to any analysis ensures clarity and allows comparisons to be made with the click of a button. Furthermore, machine learning enhances user experience by making suggestions based on the data being considered at the time.
Conclusion:
Whilst Tableau was an extremely innovative tool when it was founded in 2003, it is no match for QuickSight. With the ability to connect to the full suite of software and platforms available within Amazon Web Services, QuickSight is much more than a stand-alone tool. Businesses looking for a fast, scalable and easily understood data analytics tool cannot go wrong with QuickSight.
With the importance of data growing exponentially, it is no longer realistic to rely on the extensive knowledge of data scientists and analysts for everyday visualisations. QuickSight allows employees throughout the business to gain quick understanding of data points without having to wait for help from analysts. QuickSight is continually releasing new features to make the tool even more user friendly as time goes on.
Data Modernisation solutions offered by DNX frequently utilise QuickSight in order to provide clients with the most cost-effective, scalable and easy to use systems, increasing the power they have over their data.
DNX has the solutions and experience you need. Contact us today for a blueprint of your journey towards data security.
Harnessing the Power of Data in the Financial Sector
Digitisation has enabled technology to transform the financial industry. Advanced analytics, machine learning (ML), artificial intelligence (AI), big data, and the cloud have been embraced by financial companies globally, and the use of this technology brings an abundance of data.
When it comes to FinTech, pace is paramount. The more accurate trends and predictions are, the more positive the outcomes will be. Data-driven decision making is key.
How Data Can Benefit the Financial Industry
Today, FinTech businesses must be data-driven to thrive, which means treating data as an organisational asset. The collection and interpretation of data enable businesses to gain quick and accurate insights, resulting in innovation and informed decision-making.
It is recommended to set up business data in a way that provides easy access to those who need it.
Finance and Big Data
The compilation of globally collected data, known as Big Data, has had fascinating effects on the finance industry. As billions of dollars move each day, Big Data in finance has led to technological innovations, transforming both individual businesses and the financial sector as a whole.
Analysts monitor this data each day as they establish predictions and uncover patterns. In addition, Big Data is continuously transforming the finance industry as we know it by powering advanced technology such as ML, AI, and advanced analytics.
The Influence of ML on the Market
Powered by Big Data, ML is changing many aspects of the financial industry, such as trading and investment, because it can account in real time for political and social trends that may affect the stock market.
ML powers fraud detection and prevention technologies, reducing security risks and threats. Additionally, it provides advances in risk analysis, as investments and loans now rely on this technology.
Despite all the gains made so far, the technologies powered by advanced machine learning continue to evolve.
Security and Data Governance
The cost of data breaches is increasing. In 2021, the financial sector had the second-highest breach costs, behind only healthcare. The technology sector was the fourth most affected, meaning the risk of breaches for FinTech organisations is high.
Data governance is necessary to mitigate risks associated with the industry, which means many companies are required to undergo data modernisation. Businesses must ensure all data is secure and protected and suspicious activity is detected and flagged, in line with strict government standards.
Taking the first steps
The journey to data modernisation offers benefits that far exceed the initial cost of investment, though the process to accreditation can be daunting. The journey begins with building strategies from clear objectives, then mapping the plan, migrating data, implementing cloud tools, and beyond.
To simplify the initial steps towards compliant data modernisation, DNX Solutions has prepared a guide to help FinTech businesses modernise their data. Click here to view the 8 steps you need to take to prepare for your Data Modernisation journey.
DNX has the solutions and experience you need. Contact us today for a blueprint of your journey towards data security.
canibuild Data Modernisation Journey
canibuild
canibuild is a game-changer for the construction industry. After 20 years of facing the same problems over and over again, Timothy Cocaro founded canibuild to take the hassle out of building.
With canibuild, builders and their clients can see what can be constructed on their parcel of land in just minutes, in a virtual, easy-to-understand way. canibuild uses AI-powered technology to tap into multiple real-time data sources such as high-resolution aerial imagery, local city and county government data sets, and codification of planning rules – removing the typical “over the fence” site assessment, hand-drawn plans, and estimates. canibuild is customised for each subscriber, with individual floor plans, branding, and costs uploaded onto the platform, allowing subscribers to provide branded plans, flyers, reports, and estimations instantly, condensing outdated practices that would traditionally take weeks. It is a true one-stop shop where users can instantly site a build, check topography, and request reports to determine build feasibility and site costs, generate site plans, check compliance and produce quotes for homes, pools, granny flats, ADUs, sheds and more… all in just minutes!
canibuild is currently available in Australia, New Zealand, Canada and the United States.
The Business Challenge
Due to rapid expansion, canibuild required an experienced cloud-native partner to transform its complex cloud platform to sustain and support its growth by unlocking new data and analytics functionalities. One of the major challenges was to create a Single Source of Truth (SSOT), which involves integrating different types of data into one central location rather than leaving them in the various sources from which they were collected. Among the data canibuild requires is geospatial data: time-based data related to a specific location on the Earth’s surface. This data can provide insights into relationships between variables, revealing patterns and trends.
Delivering DataOps and Data Analytics to Grow the canibuild Business
The DNX team built a platform by implementing a DataOps approach consisting of a collection of technical practices, workflows, cultural norms, and architectural patterns that enable:
- Rapid innovation and experimentation delivering new insights to customers with increasing velocity
- Extremely high data quality and very low error rates
- Collaboration across complex arrays of people, technology, and environments
- Clear measurement, monitoring, and transparency of results
The developed data platform combines modern SaaS ingestion tools (StitchData) and DbT, AWS data services including Data Lake (S3 + Glue Catalog + Athena), Glue ETL, MWAA for orchestration, DMS for near-real-time replication, DynamoDB for control tables and Cloudwatch events for scheduling.
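One pattern the platform relies on is driving incremental loads from control tables (kept in DynamoDB in the real platform). A minimal sketch of the idea, with hypothetical table and field names and plain Python data standing in for DynamoDB and the source database:

```python
# Sketch of the control-table pattern: each ingestion job reads its
# high-water mark from a control record, pulls only newer rows, then
# advances the mark. Names and values here are illustrative only.

control = {"job": "orders_ingest", "last_loaded_at": 2}

source_rows = [
    {"id": 1, "updated_at": 1},
    {"id": 2, "updated_at": 3},
    {"id": 3, "updated_at": 5},
]

def run_incremental_load(control, rows):
    """Load only rows newer than the stored watermark, then update it."""
    watermark = control["last_loaded_at"]
    new_rows = [r for r in rows if r["updated_at"] > watermark]
    if new_rows:
        control["last_loaded_at"] = max(r["updated_at"] for r in new_rows)
    return new_rows

loaded = run_incremental_load(control, source_rows)
print(loaded)                     # only the two rows newer than the watermark
print(control["last_loaded_at"])  # watermark advanced to 5
```

In production the same logic runs inside Glue ETL jobs orchestrated by MWAA, with DMS handling the near-real-time replication described above.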

Real-time Assertive Data
After a complex process in which all relevant data were collected, sorted, and stored in one location, canibuild now has real-time insights, allowing the whole team to access the same information. The team can now predict future trends, maximise opportunities and work towards realistic goals and objectives to continue growth.
Through our knowledge transfer, DNX equipped the canibuild team with the knowledge to provision a new logical environment for its product:
- Terraform projects
- Terraform variables configuration
- DMS configurations
- Database importer/exporter
- MWAA and how to create new DAGs
- How to troubleshoot Airflow
Data Modernisation Outcome
With the creation of an SSOT and the transfer of all data into a central location, canibuild teams can now access the data they need sooner than ever before, allowing them to respond quickly and efficiently to their clients. Improved data analytics enables them to access real time insights and make more accurate predictions; a valuable asset in current times plagued by uncertainty. Furthermore, thanks to the simplification of the platform by DNX, canibuild’s engineers now have time to spare, allowing them to work on what they do best: producing new features!
To see your business soar towards the future with open arms, contact DNX today and learn how you can benefit from data modernisation.
Effective leadership depends on using data to make important decisions; you need a broad view with accurate information to take meaningful action. That is how a modern data strategy is built: to deliver insights to the people and applications that need them, securely and at any scale. DNX Brasil helps your company apply data analytics to its most business-critical use cases, with complete solutions that demand data expertise. Discover the value of data.
Plutora’s Data and Digital Modernisation Journey
About Plutora
Plutora offers value stream management solutions that help companies with release, test environment and analytics solutions for enterprise IT.
Among Plutora’s clients are global organisations typically in healthcare, Fintech and telecommunications, all of which are highly regulated and require tools to maintain compliance. In addition, clients in these industries require predictable software delivery due to their low risk tolerance.
The Business Challenge
Although Plutora generates great value for their customers, they were looking for a partner to help them decrease the complexity of their data infrastructure. They wanted a new architecture based on industry best practices, including automating their processes and modernising their multiple .NET applications due to approaching end of support. Achieving these goals would allow Plutora to evolve and give them the agility needed to launch new features.
Data and Digital Modernisation Discovery
The DNX Digital and Data team performed a comprehensive Windows and data discovery on Plutora’s workloads, which involved a kick-off followed by a sequence of intense activities. The discovery concluded with a showcase workshop where the team presented a roadmap identifying areas of improvement for the existing solution and a modernisation plan to be executed afterwards, enabling Plutora to achieve its objectives.

Solution
DNX proposed a four phase engagement plan to modernise Plutora’s data & analytics workloads.

In Phase 1, DNX validated the use of temporal tables in SQL Server to enable change data capture (CDC) for the ETL process. This was to improve estimation accuracy for Phase 4.
In Phase 2, DNX began delivering early benefits of the modernisation project by using the SQL Server replica DB for the ETL extraction and refactoring the existing SQL Server scripts to extract incremental data only.
This reduced performance impact on the application whilst enabling a higher number of ETL queries to run in parallel, thus reducing the overall time for the ETL execution.
In Phase 3, DNX removed complexity and modernised the ETL platform by implementing Managed Workflows for Apache Airflow (MWAA) to replace the Node App orchestrator, implementing DMS to replicate data between the SQL Server DW and the Postgres DW, and decommissioning the Node App orchestrator.
In the final phase, the ETL to ELT modernisation was completed.
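The temporal-table CDC validated in Phase 1 works by querying row versions within a time window, which is what makes the incremental extraction of Phase 2 possible. A hypothetical sketch of building such a query; the table and column names are illustrative, not Plutora's schema:

```python
# Sketch of an incremental-extraction query against a SQL Server temporal
# table: FOR SYSTEM_TIME returns row versions valid in a window, and a
# ValidFrom filter keeps only versions that changed within it.
# Table/column names are hypothetical.

def build_cdc_query(table, start, end):
    """Build a T-SQL temporal query for rows changed within [start, end)."""
    return (
        f"SELECT * FROM {table} "
        f"FOR SYSTEM_TIME FROM '{start}' TO '{end}' "
        f"WHERE ValidFrom >= '{start}'"
    )

q = build_cdc_query("dbo.Releases", "2022-01-01", "2022-01-02")
print(q)
```

Extracting only this delta, rather than full tables, is what reduced the load on the application and let more ETL queries run in parallel.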
Data Modernisation outcome
DNX delivered a data modernisation solution to Plutora that began seeing benefits quickly through a number of avenues:
Cost Reduction
Plutora experienced a 30% cost reduction through the migration of SQL Server to RDS and the decommissioning of redundant components, as well as eliminating Windows licensing costs.
Near Real-Time Data
The time for Data to become available for reporting was reduced from 20 minutes to just 4.
Simplicity
Replacing an ELT system built in-house with an open-source project makes Plutora more attractive to IT personnel and assists in retaining such talent. Further simplicity was achieved by reducing the number of layers in the solution, resulting in reduced cost and accelerated delivery. In addition, the FTE required to maintain and patch servers and databases was reduced.
Evolvability
A number of positive changes can now be enjoyed by Plutora, such as the removal of technical debt, decoupling from vendors, and the ability to adopt agile practices within Data & Analytics. The data strategy has created a Single Source of Truth which allows Plutora to benefit from Machine Learning, and the merging of all logic into an application layer reduces the time to change and deploy.
Conclusion
With clients who require the most up-to-date technical support, Plutora is in a position where data modernisation is absolutely crucial. With a more simplified and adaptable infrastructure, they are now able to offer the best services to their clients across the globe.
Payble Accelerates Path to CDR Compliance with DNX Solutions
About Payble
Based in Australia, Payble helps businesses increase their revenue by offering their customers flexible payment options as required. The Payble platform uses open banking to identify consumers who would benefit from flexible payment options and engages them with installment plans or payment extensions.
Navigating the Journey to CDR Compliance
When Australian lawmakers signed the Consumer Data Right (CDR) initiative into law in 2020, financial services firms across the country became eligible for open banking—the practice of giving consumers access to and control over their banking data. However, to receive customer open banking data, banks and other institutions needed to become accredited as an Accredited Data Recipient (ADR) by the Australian Competition and Consumer Commission (ACCC) by implementing stringent privacy safeguards and rules to ensure secure protection and management of data. This path to CDR accreditation is complex and time-consuming.
It’s a challenge Payble knows all too well. The Australian fintech uses open banking technology to help customers prevent missed or late payments before they happen. CDR data is a critical component of Payble’s solution. “CDR is incredibly complex, and because it’s new in Australia, there’s no easy method to copy and implement,” says Elliott Donazzan, CEO of Payble. “In addition to specific requirements, there are nuances that don’t apply to the general regulations we’re accustomed to. Plus, a lot of work is required to build the right technology to support everything. CDR is not our core business, so we needed the right partners to help achieve accreditation.”
Collaborating with AWS Partners to Solve the CDR Challenge
Payble has been running on the Amazon Web Services (AWS) Cloud since the company’s inception, using a range of AWS services to support its application environment. Through its relationship with AWS, Payble was introduced to a group of AWS Partners that specialize in accelerating the financial technology industry’s CDR accreditation and technology solutions. This network of partners includes DNX Solutions, an AWS Advanced Consulting Partner; AssuranceLab, a modern assurance firm that provides accreditations for CDR and global standards; Astero, a cybersecurity company specializing in open banking and CDR; and Adatree, a proprietary, AWS-built CDR Platform for Data Recipients. “We had conversations with Adatree and began sharing engineering strategies,” says Helder Klemp, CEO of DNX. “After discussing with AWS about some of the other partners that we could work with, we decided to jointly develop a solution to help businesses become accredited.”
Developing CDR in a Box Solution
The partners created CDR in a Box, an AWS-based, compliant CDR platform. The modular platform is based on the AWS Well-Architected Framework and features core AWS security components including AWS Security Hub, Amazon GuardDuty, AWS Identity and Access Management (IAM), and AWS Key Management Service (KMS).
CDR in a Box includes the ADR Accelerator, a business solution jointly developed by Adatree and Astero. The template-based solution is designed to help enterprises accelerate their Accredited Data Recipient (ADR) application, a key part of CDR compliance. An accredited data recipient is a business that has been accredited by the ACCC to receive data from a data holder.
Adatree’s platform is built on AWS and runs on a range of AWS services.
Astero used its cybersecurity expertise to support CDR in a Box with security solutions including technical security documentation and controls assessment services required for accreditation. “CDR in a Box ensures customers follow a security and risk-first approach to compliance,” says Sandeep Kumar, CEO of Astero. “This starts by helping customers define the boundary and data flows of their CDR data environment, performing threat assessments, and implementing appropriate security controls.”
AssuranceLab contributed to CDR in a Box by using its accreditation expertise and skillset to build the required technical security documentation for CDR audits. “As a group, we brought four expert offerings together into one seamless solution for Payble,” says Paul Wenham, CEO of AssuranceLab. “By understanding each other’s approach, and working effectively together, it removed the guesswork and business disruption for Payble to focus on what they do best.”
Payble used the ADR Accelerator to provide the business readiness documentation for the company’s ADR application audit. DNX also supported Payble throughout the auditing process, offering automated compliance capabilities. The overall combined partner offering includes guidance and support specifically tailored to Payble’s business.
Building Audit-Ready CDR Environment in 4 Weeks
Because the AWS partners worked together to build a well-architected AWS solution for ADR applicants, Payble gained an audit-ready environment and a completed audit in four weeks. AssuranceLab carried out the audit in parallel with the implementation activities.
Payble also took advantage of ADR Accelerator to provide business readiness documentation for the company’s ADR application six months faster than the normal timeframe for accreditation. “The sentiment in the industry in Australia is that the CDR is too hard to get into because of cost and time commitments, but startups need it in order to provide something compelling to market,” Kumar says. “We’re trying to make CDR access simpler while still meeting all the compliance requirements.”
“As a startup, we need to move quickly and access the benefits of CDR compliance as fast as possible,” says Donazzan. “By working with AWS partners to complete the ADR application process faster than we could have by ourselves, we can focus on our core business instead of the accreditation process.” In addition to accreditation, Payble benefits from having a strong security and compliance foundation for its business, built by DNX based on AWS Well-Architected principles.
Eliminating the Need to Hire Specialized Staff
Payble reduced the need to hire specialized internal audit staff due to the AWS partners’ combined controls assessment, technology, documentation, and security services. “We only have one point person to work on compliance issues, and the AWS partner solution helped us avoid hiring more people to work on the accreditation process,” says Donazzan.
The solution has streamlined the engagement between Payble and compliance auditors. “Becoming CDR compliant is important, but startups don’t necessarily have the resources to hire a fulltime security compliance person or expensive engineers,” says Kumar. “By using our solution’s automation, Payble did not have to begin with a blank sheet and try to understand CDR rules and create security policies. They could move quickly on the entire process.”
Cuts Accreditation Costs by 50%
Rather than investing time and money into hiring a specialized compliance professional, learning everything required for CDR, and preparing all the documentation, Payble streamlined the entire process via the CDR in a Box solution on AWS. “We were considering a compliance solution that would’ve cost twice as much as the AWS partner option,” says Donazzan. Overall, Payble spent less than $90,000 on infrastructure, documentation, and audit costs. “In the financial services industry, complete compliance solutions can cost many, many times more than that,” says Kumar.
As of November 2021, Payble received accreditation as an unrestricted Data Recipient. Weeks later, it reached Active Status through Adatree’s platform, which required passing technical conformance tests to ensure compliance with rigorous technical standards. It is the only business in Australia to reach this status through an intermediary.
The four AWS partners are continuing to work alongside Payble. “Audits can be complex and painful, but as a team, we worked together to simplify the process,” says Kumar. “Our relationship with Payble will continue into the future.”
Benefits
- Builds audit-ready CDR environment in 4 weeks
- Provides ADR application documentation 6 months faster than industry average
- Eliminates the need to hire specialized staff
- Reduces accreditation costs by 50 percent
Reinventing myDNA Business with Data Analytics
About myDNA
myDNA is a health tech company bringing technology to healthcare with a mission to improve health worldwide. They developed a personalised wellness myDNA test that lets you discover how your body is likely to respond to food, exercise, sleep, vitamins, medications, and more, according to your genome.
It is a life changer for those who want to skip the lengthy trial-and-error process and achieve their desired fitness goals sooner. Moreover, myDNA is a reliable way of assisting practitioners in selecting safe and effective medications for their patients based on their unique genetic makeup. For example, doctors can prescribe antidepressants and post-surgery painkillers that are more likely to be successful in the first instance.
The most exciting part is that this technology, which has historically been so expensive, is now available at an affordable price for normal people like you and me! Not to mention, finding out you have relatives on the other side of the world through a family matching DNA test is pretty cool!
Providing life health services based on accurate data
After replatforming myDNA’s IT systems from a distributed monolithic database to a microservice architecture, the team needed assistance in delivering automated tools and meaningful insights throughout the business. This would give them an understanding of potential areas and markets in which to expand their services, the agility to move and change fast as a business, and an advantage over competitors by delivering the services, products, and customer experience their customers seek. This is all based on data rather than assumptions.
myDNA was seeking a cloud consultant that could assist them in exploring and understanding events by expanding their data and analytics capabilities. In addition, the business planned to increase their data skills so their in-house IT team would be able to maintain and continue building the new applications in a safe and effective environment.
AWS performed a Data Lab with myDNA stakeholders where they co-designed a technical architecture and built a Pilot to start the journey. This gave the myDNA team an understanding of all the AWS cloud data and analytics solutions available. However, they required a personalised and well-designed technology roadmap taking their IT skills and myDNA business goals into consideration, as opposed to a ‘one solution fits all’ strategy. This is exactly what DNX Solutions delivered!
How did DNX Solutions help myDNA establish a modern security data strategy in just one month?
The project started with DNX’s effective and interactive discovery, where our team identified the company’s needs and gained a complete picture of the existing company data, the architecture used, and potential technological and team challenges. With that, our team created a clear roadmap where outcomes were evident even before the conclusion of the project.

In the initial phase, DNX built the MVP using the AWS Console, general-purpose roles and data sources, and built simple reports and dashboards to present basic metrics.
After that, our cloud data experts built a more robust solution fit for production, with a focus on resilience, performance, reliability, security and cost optimisation, using DevOps methodology, CI/CD pipelines, automation and serverless architecture wherever possible.
Once the core platform was established, we brought more data sources, integrating them into the solution, and helped to build more complex and advanced solutions such as Machine Learning.
AWS Services Used
S3 Datalakes
Raw: hosts the data extracted allowing governance, auditability, durability and security controls
DynamoDB / SSM
Stores configuration tables, parameters, and secrets used by the pipeline and ETL Jobs to automate the data process
Crawlers
Crawlers can scan the files in the datalake or databases, infer the schema and add the tables on the data catalogues
Glue ETL
Serverless Spark solution for high performance ETL jobs within AWS
Data Catalogues
Stores the metadata and metrics regarding Databases, Connections, Jobs, partitions, etc. It can grant/deny access up to the table level
QuickSight
Can consume data from multiple sources within AWS and allows user-friendly development of reports, analytics and dashboards integrated with the AWS platform
Lake Formation
Low code solution to govern and administer the Data Lake. An additional layer of security including row/column level controls
Lambdas
Wild cards that can help tie the solution together in a variety of roles and use cases
Athena
Athena can query data stored in S3 using simple SQL. It allows access segregation to metadata and history via workgroups, which can be compounded with IAM roles
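The crawlers, catalogue and Athena all hinge on how files are laid out in the S3 data lake: Hive-style `key=value` prefixes let partitions be inferred from paths alone. A small sketch of that convention, with hypothetical bucket-prefix and table names:

```python
# Illustrative Hive-style partition layout used by Glue crawlers and
# Athena: partition columns are encoded in the S3 key itself.
# Prefix and table names are hypothetical.

from datetime import date

def partition_key(table, dt, filename):
    """Build an S3 key with Hive-style partitions (year=/month=/day=)."""
    return (f"raw/{table}/year={dt.year}/month={dt.month:02d}/"
            f"day={dt.day:02d}/{filename}")

def parse_partitions(key):
    """Recover partition columns from a key, as a crawler would."""
    return {p.split("=")[0]: p.split("=")[1]
            for p in key.split("/") if "=" in p}

key = partition_key("orders", date(2022, 5, 9), "part-0001.parquet")
print(key)
print(parse_partitions(key))
```

Because partitions are visible in the path, Athena can prune them from a query (e.g. `WHERE year = '2022' AND month = '05'`) and scan only the matching objects, which keeps per-query cost low.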
myDNA to provide real insights at the click of a button
There is no doubt that DNX Solutions delivered value to myDNA. The team reported they were able to deliver another data transformation that depended directly on the result of DNX’s work.
Before engaging DNX, the myDNA team could take three to five days to deliver a few manual reports in response to business queries. The company is now able to deliver different reports based on live data with just a click of a button. Not only does the business have accurate, insightful data on which to base decisions about what, when, and where to invest, but it also has the agility to make these decisions.
The myDNA team can now focus on what they do best rather than spending days merging unreliable information from various sources to produce a handful of outdated reports.
The next step for myDNA is to adopt AWS machine learning to unveil predictions, achieving far better real-world results.
How to Attract and Retain IT Personnel
Attract and Retain IT Personnel
Finding and retaining IT personnel can be challenging. Tech companies are the new black, and everyone is always on the lookout for the next big thing. The tech industry is constantly changing, meaning you not only need an employee who is competent and has the right skills for the job, but you also need someone adaptable. On top of a very specific skill set, you’re searching for the right fit for your team. Often, after a long but successful search, your IT personnel up and leave as they get a better offer. Now you are back at square one. If you’re not in Silicon Valley you may feel as though the best talents are passing you by, so how can you make your company more attractive to IT personnel, and furthermore, how can you keep them interested? Read on to learn what attracts and retains talent in tech.
First and foremost, technology professionals care about technology
The majority of people who choose technology as a profession, do so because they love it. IT professionals are passionate about their work and they are looking for ways to advance technology usage and types. Passion results in high levels of knowledge and curious minds that never stop researching. For this reason, IT personnel want to know what they will be working with, and how the company will react to new technologies and software as they are developed. By having a detailed technology roadmap in place you can entice IT personnel to take an interest in your business. A roadmap that is up-to-date, data-driven and forward-facing is what will catch the eye of professionals. If your software is behind the times you would benefit from planning to modernise your data. Outdated technology is difficult to upgrade and unable to meet modern day standards. If you are running an old version of .NET or Java, for example, you are unlikely to attract the IT professionals of the future. There is nothing more unattractive than a tech company plagued by inertia. By modernising your data and having a solid roadmap in place you can show the tech community that you are heading in the right direction. It isn’t too late, but if you don’t make the move soon, it may be. Aside from general enquiries, IT professionals may come to interviews with specific questions, and the more specific you can be when answering the more they will know you care about technology too.
Who is interviewing who?
IT personnel face no shortage of job opportunities. When interviewing someone for a tech position in your company, you may see the tables turn and find yourself on the receiving end. Preparing answers to the questions interviewees are likely to ask will give them faith in you and your business. Here are a few questions that an experienced IT professional may throw your way:
- What’s your current tech stack?
- What are your policies on updating and using current and modern technologies?
- How do you keep your technology updated?
- How do you release new versions?
- How do you adopt new versions?
- How do you test new possibilities?
Be specific. Ensure you have someone knowledgeable on the panel who can answer these questions with confidence. Having the CTO available to outline the roadmap and dive deep into the software used may win over the candidate. In addition, by letting it be known which software and programs you use, you may attract more tech talents who like working with that particular technology.
Catching it and keeping it are two different things.
So having an up-to-date roadmap and modernised data is a way of attracting tech talent into your business, but how do you hold on to them with the ever-present threat of tech giants peeking over your shoulders?
IT professionals are some of the most innovative minds of our times. They like to stay stimulated and they like to move forward. If you want to retain IT personnel, you have to make sure they are being rewarded with more than just a good salary. Empower your employees by embracing a learning environment: invest in education and hands-on training opportunities. Give employees the option of focussing on what interests them and play to their strengths. If an employee is keen to study machine learning, find out if there is room for machine learning in your business and implement it. This way not only are you supporting the growth of your employee but you will likely benefit from what they learn. In addition, consider including your IT personnel in the development or revision of your technology roadmap. Put them on the team and incorporate their insights, allowing them to see that their inputs are valued. Professionals are more likely to stay on a project where they feel they have some ownership. Professionals who are new to your team are also likely to have an idea of what competitors are doing, which is important to know. Using tools such as Tech Radar provides insight into which technology the community is currently excited about and what is on its way out.
We can forecast, but we’re not fortune tellers!
It is true that technology can be unpredictable. There are plenty of examples in recent history where hindsight has taught us a thing or two. Remember when Blockbuster laughed in Netflix’s face at the suggestion of buying them out? Um, does anyone even remember Blockbuster at all? We rest our case: technology can be tricky. There is always a gamble in the future of tech, and not every business is going to get it right. There are entire organisations that can crash simply because of a new technology that disrupted the industry and made certain products or services obsolete. The important thing is to always be prepared as you can be, be agile and flexible. Value the input of your IT professionals and be willing to consider all options. Don’t walk among the dinosaurs, soar among the stars.
Need a technology professional, but don’t work in a technology company? We have news for you.
Technology is no longer restricted to technology companies. What? Let us explain. Just because your company is not categorised as being in the technology industry does not mean you are exempt from needing a technological roadmap and structured tech activities. In this day and age, technology is integral to everything we do. The agriculture industry utilises IoT devices and drones performing reconnaissance via GPS; the energy industry provides homes with smart meters showing real-time measurements; even the CEO of General Motors referred to GM as a software company for cars back in 2013. If you need to hire an IT professional, you need to consider yourself a technology company.
Know your target.
In conclusion, to attract and retain IT personnel, you need to know what they want. You must understand their desire for advanced technology, a culture of agility, and a learning environment, and then you must implement it. Make your company a place where people can grow so they don’t feel the urge to find growth elsewhere.
DNX has the solutions and experience you need. Contact us today for a blueprint of your journey towards data security.
Migrating from Azure to AWS
About Ferret
Ferret is a company providing a Relationship Intelligence solution. It collects data from tens of thousands of sources, uses AI and machine learning to classify this data, and allows its customers to have access to exclusive databases, such as historical and real-time negative news, politically exposed people, papers and leaks, illegal activity, social media sentiment analysis, and more.
The Business Challenge
Ferret started developing and deploying its Relationship Intelligence solution on Microsoft Azure, but faced major issues. The most critical related to training its ML models, as the Azure infrastructure could not scale to provide the capacity required. Ferret also struggled to import its large on-premise datasets into Azure: the Production DB could not be successfully imported using either its tool of choice (Pentaho Kettle) or Azure Data Factory, because the data, application, and DevOps teams worked in silos and were unable to solve data migration issues end-to-end.
The suggested workaround of splitting workloads between different Azure data centres produced significant delays and data integrity issues, severely denting Ferret's confidence in delivering and launching its solution on time and compromising its business operations.
Moreover, fees of 80K per year for the test environment alone had already exceeded expectations.
Ferret understood the benefits of being in the cloud, most of which aligned with the company's needs. After failing to obtain the desired cloud outcomes on Azure, the company decided to invest in migrating its workloads to AWS.
Ferret required a partner with extensive cloud and AWS expertise to overcome the issues with its complex infrastructure and help deliver a stable, reliable application by its deadline on a more cost-effective basis.
The Assessment Phase
DNX Solutions was engaged to create and execute a roadmap for Ferret's migration from Azure to AWS, ensuring the company obtained the AWS Migration values of cost savings, staff productivity, and operational resilience, along with fast time to market.
During the discovery phase, utilising the AWS Migration Readiness Assessment (MRA) and workshops, DNX mapped Ferret's business and technical capabilities (interdependencies, storage constraints, release process, and level of security).

With all necessary information gathered, and based on the AWS "7 Rs" best practices (retain, retire, relocate, rehost, repurchase, replatform, and refactor), DNX developed and proposed the following migration plan:

The Mobilise Phase

Migration Strategy

Due to Ferret’s implementation time constraint and focus on fast delivery, DNX designed an AWS platform that relies on the following migration strategies:
- Re-hosting (lift-and-shift)
- Re-platforming
The main focus was to do a re-hosting of all the relevant services from Azure to AWS but to do a re-platforming of specific services based on AWS managed services which can provide better control over the solution while reducing maintenance overhead and delivery time.
With this approach in mind, DNX Solutions followed these design principles:
- Kubernetes only for Stateless apps.
- AWS managed services over self-maintained services.
- Bamboo for mobile app and Bitbucket Pipelines + Argo CD for cloud apps
- Az Blob Storage to AWS S3
- Az Cosmos DB to Atlas MongoDB
- k8s Kafka/Zookeeper to AWS MSK (Managed Service for Kafka)
- k8s Elasticsearch/Kibana to AWS Elasticsearch
- k8s neo4j to neo4j on AWS EC2
- Az AKS to AWS EKS
The Solution
Cloud Foundation
The project started with the implementation of our Cloud Foundation, which distils years of cloud experience into a platform built with simplicity in mind on infrastructure-as-code and open-source technologies, designed for AWS with Well-Architected principles.

Application Modernisation

The second phase involved migrating applications running on Azure Kubernetes Service to the AWS Managed Kubernetes Service (aka EKS). At Ferret, they used to have StatefulSets running and hosting their data solutions, such as Elasticsearch, Kafka, etc. These services were replatformed to AWS Managed services so that we could offload the work of maintaining these pieces of infrastructure.
Next, we migrated the Kubernetes stateless apps to EKS. One of the first choices was to use Spot Instances, which can be up to 70% cheaper than regular instances. We also used our open source projects to create the EKS Cluster and added custom Kubernetes controllers to deploy ArgoCD, manage external secrets, configure AWS Load Balancer, control DNS, push to CloudWatch, etc.
This step allowed us to not only migrate the applications, but also set up a GitOps workflow using ArgoCD, making things more efficient and empowering developers to have more control over their kubernetes deployments. Additionally, we configured GPU instances on kubernetes to run a very specific AI workload.
Furthermore, since one of the goals was to reduce operational costs, their hosted pipeline was migrated to a managed pipeline solution.
Just to mention a few other services involved in this phase:
- Amazon Elasticsearch Service
- Amazon Managed Streaming for Apache Kafka (Amazon MSK)
- System Manager (SSM Parameter Store)
- Amazon S3
DNX provided knowledge transfer across several sessions to enable the customer to understand AWS concepts and keep the application safe, reliable, and at a predictable cost.
DNX also created CloudWatch performance dashboards and set up alarms to inform Ferret of any potential problems and to keep the application safe.
Data Platform
Ferret had not been able to successfully migrate its massive on-premise datasets to Azure. In its initial attempt, using its on-premise ETL tool (Pentaho Kettle), it faced multiple security and connectivity issues, and the tool was discarded. A second attempt using Azure Data Factory failed multiple times, and the estimated transfer time of a whole week made it unfeasible. Ferret ended up uploading only a small amount of data to enable development and testing, raised data migration as an issue, and was never able to solve it.
DNX proposed the architecture below for the Data Platform. In this phase, we worked to Ferret’s instruction, which was to configure their MongoDB cluster and permit network connectivity to their external tool, enabling them to feed their data.

They were using Cosmos DB and a hosted Elasticsearch cluster on Azure, with an on-premise Pentaho Kettle instance migrating the data.
We proposed S3 for staging, a managed Elasticsearch service, and Atlas MongoDB to replace Cosmos DB. The initial recommendation was to use Glue for ETL; however, after further discussion with the customer, we decided to keep Pentaho for ETL, as Ferret’s developers were familiar with the tool, and helped them connect to our secure network to import the data and move it to the required endpoints.
Conclusion
Everything described in this case study was planned, designed, and implemented with key principles in mind, such as high availability, scalability, disaster recovery, security, elasticity, fault tolerance, and cost optimisation.
These new environments and AWS services provided Ferret the following benefits:
- Operational Excellence: by moving to AWS onto a well-architected platform, the customer gained an automated deployment process, self-healing applications, and more efficient application management.
- More cost-effective IT environments: by optimising compute, storage, and database costs and moving away from running its IT infrastructure on Azure, Ferret will spend 50% less on fees, running a comparable Kubernetes cluster with CPU and GPU spot instances on AWS.
- Improved time-to-market: we reduced data migration time from 24h to 5h by using parallelism and optimising packet sizes, whilst reducing the cost and complexity of the overall solution. Shifting IT staff focus to differentiated work and strategic business initiatives also brings substantial gains: on average, teams are 62% more efficient and application developers 25% more productive with AWS.
- Instilling IT and business operations with the agility required to deliver cost-effective IT resources on demand as business opportunities arise, with surveyed organisations delivering almost three times more new application features with AWS, helping them win more business and increase revenue.
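The parallelism behind the 24h-to-5h data migration improvement can be sketched as below. This is an illustrative sketch only: the batch size, worker count, and `transfer_batch` placeholder are assumptions, not Ferret's actual pipeline.

```python
# Hypothetical sketch of parallel chunked transfer: split the dataset
# into batches and move several batches concurrently, which is the
# general idea behind cutting migration time via parallelism.
from concurrent.futures import ThreadPoolExecutor

def transfer_batch(batch):
    # Placeholder for the real upload step (e.g. an S3 multipart upload).
    # Here it simply reports how many records the batch contained.
    return len(batch)

def migrate(records, batch_size=1000, workers=8):
    # Chunk the records, then transfer chunks in parallel threads.
    batches = [records[i:i + batch_size]
               for i in range(0, len(records), batch_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        moved = sum(pool.map(transfer_batch, batches))
    return moved  # total records transferred

print(migrate(list(range(2500)), batch_size=1000, workers=4))  # → 2500
```

Tuning the batch ("packet") size against network throughput is what the optimisation above refers to; larger batches amortise per-request overhead, while more workers exploit available bandwidth.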
At DNX Brasil, we work to bring a better cloud and application experience to digital-native companies. Our focus is on AWS, Well-Architected Solutions, Containers, ECS, Kubernetes, Continuous Integration/Continuous Delivery, and Service Mesh. We are always looking for experienced cloud computing professionals for our team, with a focus on cloud-native concepts. Check out our open-source projects at https://github.com/DNXLabs and follow us on Twitter, LinkedIn, or YouTube.
Appearition and its Omni platform for Immersive Technology
About Appearition
Appearition is an Australian technology company focused on creating immersive technologies that serve multiple industry verticals via a flexible API led interface.
Appearition was created with the purpose of providing cutting edge technology and also to be the agent of change by reducing the barriers to entry into immersive technologies for all sizes of enterprises. It is a group of passionate, purpose-driven individuals who are experts in immersive technology, agile software development, and product delivery.
They built the world’s first headless content management platform for immersive technologies that can serve multiple industry verticals via a flexible API led interface.
About Appearition and its Omni platform for Immersive Technology
The Appearition team created a modular platform solution in blocks that can be configured independently, client-by-client. The immersive experience management and deployment platform supports a variety of output types including wearable 3D headsets. They called it the Experience Management System (EMS).
The platform can be accessed through a portal, mobile applications, desktop applications, and/or development environments such as Unity. This allows clients to distribute the immersive experience through web, mobile, wearables, and embedded platforms.

The Business Challenge
The immersive technology platform was developed using .Net Framework in a very modular way. This has enabled Appearition to serve different verticals along the way. The solution follows an impressive modularisation strategy that encompasses source code, library, and state management as well.
Now, Appearition has decided to modernise the underlying platform, which until now was based on the .NET Framework. DNX Solutions was brought in to help Appearition design and plan a comprehensive rewrite onto a modern tech stack.
The main business driver is to base the platform on a sustainable architecture ready to support all business activities for the coming 5 to 10 years, giving the company a sustainable edge and allowing it to stay agile and competitive in the market. The following needs were identified when setting the requirements for the platform technology’s imminent evolution:
- Reduce time to market by automating as many steps in the release process as possible.
- Independence from any one platform, ensuring the solution is not coupled to Windows only, thereby avoiding Windows licence costs just to run the application.
- Leverage modern cloud technology to further reduce the Total Cost of Ownership for the solution.
- Reduce the man-hours necessary to maintain the solution by leveraging managed services as much as possible, so that Appearition talent can focus on creating more innovation in the immersive experience space.
Appearition has engaged DNX to assist in the analysis and planning on how to modernise their immersive application platform. DNX has executed a discovery project to co-create ways to achieve the desired goals.
The Discovery Solution
The Windows Discovery project took six weeks. DNX looked into Appearition’s needs through three lenses: business goals, technical feasibility, and team knowledge.
Through well-facilitated visual workshops, a team of cloud specialists from DNX, together with technical and product SMEs from Appearition, mapped the business necessities and the strengths of the current architecture, and defined the best ways to move from the current situation to a modern cloud architecture. The discovery generated a modernisation technology roadmap that includes:
- a strategy to modernise the source code from .NET Framework to .NET 5
- a path to keep using the current and Long-Term Support version of .NET technologies
- training and guidance for Appearition technical team on modern cloud-native solutions
- the design of an application platform that is cloud independent; and
- a cost view on the modernisation effort as a project.
The Outcome
After the Windows Discovery project, Appearition now has a technology roadmap, a bounded context map, and an execution plan for their endeavour. What before was just a dream for the Appearition technical team is now an executable plan. And because DNX is an advanced AWS partner, the discovery project has demonstrated to Appearition the AWS funding that they could access to help accelerate their modernisation journey. The outcomes for a discovery project can vary a lot from one customer to another. For Appearition, the outcomes included a clearly defined and executable pathway towards:
- decoupling EMS application from windows servers
- upgrading the code base from .NET 4.6.1 to .NET 5 (targeting Docker® containerisation)
- reducing the execution cost for the whole solution
- evolving the EMS architecture into one that can handle the next 5-10 years of incoming requests from business areas
- helping to communicate the Appearition architecture modernisation opportunity for investor funding; and
- upskilling Appearition technical team into the new tech stack.
Now, Appearition has a clear understanding of what success looks like: different modernisation scenarios for executing the project, better certainty in decision-making, and an understanding of investment and AWS funding, enabling the customer to make decisions better aligned with its business strategy.
Brighte Capital restructures its AWS organisations, improves security, and achieves a 50-60% cost reduction.

About Brighte
Brighte Capital is a rapidly growing Australian FinTech founded in 2015, making solar, battery, and home improvements affordable for Aussies all over the country.
Its mission is to make every home sustainable, offering Aussie families affordable access to sustainable energy solutions through an easy payment platform.
The company offers financing and zero-interest payment solutions for the installation of solar panels, batteries, air conditioning, and lighting equipment.
The process is simple and fast, all managed via Brighte’s website or smartphone app. Once your application is approved, you get access to highly vetted vendors offering interest-free products. Brighte recently received the Finder Green Awards 2021 in the category of Green Lender of the Year, an incredible achievement that recognises and solidifies its position in the Australian market.
As a company operating in both the Energy Industry and Financial Services Industry, Brighte must comply with numerous standards, rules, and regulations highlighting operations, security, and data protection as key topics. Australian Privacy Principles, Anti-Money Laundering and Counter-Terrorism Financing Act 2006, and National Consumer Credit Protection Act 2009 are just some examples.
But as a customer-centric company, Brighte goes beyond mere compliance requirements. Transparency and making life easier are two of its most important values, so Brighte is alert to other factors which can bring damage to their clients, well beyond compulsory minimum standards.
The Business Challenge: consolidate and improve the core digital platform architecture while prioritising security
Brighte’s business model is impressive and there has been considerable investment in a robust digital platform to support the different areas of the company. There is substantial technology in-place behind the scenes, with the business headed by a dedicated team of professionals with diverse backgrounds and skills, all contributing to a strong work culture.
As a relatively young company, Brighte has experienced exponential growth. Even with best practices in-place, it was difficult to continually manage or upgrade the various IT solutions the business was using.
Most of Brighte’s applications were developed in-house and based on a range of different programming languages and technologies. While its infrastructure was hosted on AWS, different services were being used to support each application, causing issues around ease of management, knowledge retention, and sharing; on top of that, the increased vulnerability introduced by manual interactions needed to be addressed to retain and improve security.
Brighte needed to revamp its landscape and reevaluate the current architecture of its core digital platform. The business reached out to DNX, seeking a solution that would improve its cloud strategy, apply DevOps best practices, reduce infrastructure operational overheads, and achieve overall cost optimisation. However, given the financial nature of its business, these challenges needed to go hand-in-hand with security. DNX therefore understood that the challenge was to deliver those improvements while prioritising security.
The DNX Solution: infrastructure, pipelines, AWS Stack, deliverables, project, UI, frontend + backend
Prior to project kick-off, DNX began a discovery phase to maximise the information collected about the challenges faced by Brighte’s team. A Well-Architected Review Framework was delivered to identify risks and opportunities against operational excellence, security, reliability, performance efficiency, and cost optimisation pillars. This enabled DNX to ensure and maintain focus on the most important priorities, such as security and operational excellence, while the team went through the DevOps Transformation guidelines to draft a plan for the required changes, working towards continuous innovation during the course of the project.

Comparing best practices enables the team to identify new opportunities and highlight concerns that may not be apparent at the beginning.
From an infrastructure perspective, DNX recognised that Brighte needed to improve control over its AWS resources using IaC (Infrastructure as Code) and restructure its AWS organisation and accounts strategy.
To achieve this, DNX suggested its DNX.One Well-Architected Foundation (aka DNX.One) to provide the following benefits:
- New structure of AWS organisation following the best practices in the market.
- Ability to manage all infrastructure resources across all of their AWS accounts based on Terraform and CI/CD pipelines.
- Designed for AWS with Well-Architected principles
It is important to mention that DNX.One is a ready-to-go solution that solves the most common business needs regarding cloud infrastructure: it fits different application architectures (including containers), offers flexibility and automation across distinct platforms, and enhances management to keep the business under control.
An extra layer of high-level security best practices, applied by default to the architecture, guarantees continuous security at any stage. It ensures that whatever challenges customers need to overcome, they will do so in a secure way.

From the applications point of view, DNX identified Brighte was using different types of AWS services to deploy their applications, including ElasticBeanstalk, ECS with Fargate, and EC2 instances.
Having these different types of application deployments is expensive, as the company needs multiple operational processes to manage the environment; it is also less secure, because no single consistent security model is applied, effectively introducing risk.
With its Application Modernisation strategy, DNX suggested containerisation of the client’s main applications and deployment via ECS with spot instances. This change would substantially reduce Brighte’s costs, create a pattern for new applications that future business growth may require, and improve security by providing a single security pathway that leverages AWS’s side of the Shared Responsibility Model, making security simpler through ECS.
The CI/CD pipeline strategy was also evaluated and Brighte’s team demonstrated a willingness to adopt solutions that would reduce the complexity of managing new deployments and providing faster response times to deploy new applications in their landscape.
Key Project Phases:
Cloud Foundation (aka AWS Foundation)
With our automated solutions based on Terraform (IaC), DNX restructured Brighte’s AWS resources such as AWS organisation, accounts, network, domains, VPN, and all the security controls for account access via SSO using Azure AD as their Identity Provider.
Building a strong and secure foundation for Brighte’s applications was a critical first step prior to modernisation. With a multi-AZ strategy with ECS nodes running on spot instances deployed in their environments, Brighte was able to run a cluster of Docker containers across availability zones and EC2 instances, while optimising costs and simplifying the security operating model.

Security:
Although security has been considered and addressed at many stages, and several cloud technologies have been put in place to protect data, systems, and assets following best-practice guidance, some AWS services still deserve to be highlighted.
Amazon CloudWatch
The logs from all systems, applications, and AWS services have been centralised in the highly scalable AWS CloudWatch service. It allows easy visualisation and filtering based on specific fields, or archiving them securely for future analysis. CloudWatch Logs enables you to see all of your logs, regardless of their source, as a single and consistent flow of events ordered by time, and you can query and sort them based on other dimensions, group them by specific fields, create custom computations with a powerful query language, and visualise log data in dashboards.
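As a sketch of the query capability described above, the helper below builds the arguments for a CloudWatch Logs Insights query over the last hour; the log group name and query string are hypothetical examples, not Brighte's actual configuration.

```python
# Sketch of querying centralised CloudWatch Logs with Logs Insights
# via boto3. The log group and query below are illustrative only.
import datetime

def build_query_params(log_group, query, minutes=60):
    """Build start_query arguments covering the last `minutes` of logs."""
    now = datetime.datetime.now(datetime.timezone.utc)
    start = now - datetime.timedelta(minutes=minutes)
    return {
        "logGroupName": log_group,
        "startTime": int(start.timestamp()),  # epoch seconds
        "endTime": int(now.timestamp()),
        "queryString": query,
    }

params = build_query_params(
    "/app/production",  # hypothetical centralised log group
    "fields @timestamp, @message | sort @timestamp desc | limit 20",
)

# With AWS credentials configured, this would run the query:
# import boto3
# logs = boto3.client("logs")
# query_id = logs.start_query(**params)["queryId"]
# results = logs.get_query_results(queryId=query_id)
```

Because the query runs server-side, filtering, grouping, and custom computations happen in CloudWatch rather than in the application.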
AWS CloudTrail
All AWS events are reported to a centralised CloudTrail and exported to an S3 bucket in an Audit account.
AWS Organisations
The setup of new accounts has been automated, with service control policies (SCPs) applying permission guardrails at the organisation level.
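An SCP guardrail of the kind applied here looks like the policy document below. This is a generic illustrative sample (denying member accounts the ability to leave the organisation), not Brighte's actual policy set.

```python
# Illustrative SCP guardrail, expressed as the JSON policy document
# that AWS Organizations attaches to accounts or OUs. This sample is
# an assumption for illustration, not the customer's real policy.
import json

scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyLeaveOrganisation",
            "Effect": "Deny",
            "Action": "organizations:LeaveOrganization",
            "Resource": "*",
        }
    ],
}

# The JSON body that would be passed to create_policy / attach_policy:
print(json.dumps(scp, indent=2))
```

SCPs do not grant permissions themselves; they set the outer boundary that IAM policies in member accounts cannot exceed.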
Amazon GuardDuty:
DNX implemented centralised GuardDuty to detect unexpected behaviour in API calls. GuardDuty alerts when unexpected and potentially unauthorised or malicious activity occurs within the AWS accounts.
DNX has helped Brighte to strengthen its workload security along with a number of other relevant AWS resources, such as Amazon CloudFront, ECR image scanners, AWS IAM identity provider, VPC endpoints, AWS WAF, and AWS Systems Manager Parameter Store.
Cost savings:
There were three main cost optimisation drivers used for this project. The combined use of these three strategies brought savings in the order of 60%, compared with the same workloads on the previous environment, while allowing Brighte to use several new resources delivering more value with less cost to its clients.
- Using ECS clusters with EC2 Spot Instances: Spot instances are unused AWS capacity that is available for a fraction of the normal On-Demand prices. Spot instances can be reclaimed by AWS when there is no available capacity, so DNX uses an auto-scaling model with several instance types that ensures availability while saving around 70% compared with On-Demand. For instance, an On-Demand t3.xlarge instance costs $0.2112 per hour while the same Spot instance costs $0.0634.
- Savings plans for Databases: As the databases are stable and their use can be predicted over a long duration, AWS allows us to reserve a DB instance for one, two, or three years, with monthly or upfront payments, charging a discounted hourly rate saving from 30% to 60%, according to the chosen plan.
- Automatic scheduler for turning resources on and off according to a usage calendar: for Development and Testing environments, which are not meant to be used on a 24/7 basis, Brighte can easily schedule when these environments are available to the teams and when they should be turned off (scaling them to zero), saving around 50% compared to a full-time available environment. The scheduler mechanism allows the resources to be used at any desired time, bypassing the default calendar, in an easy-to-use way.
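Using the t3.xlarge prices quoted above, the Spot saving works out to about 70%:

```python
# Saving from Spot vs On-Demand, using the t3.xlarge hourly prices
# quoted in the text above.
on_demand = 0.2112   # USD per hour, On-Demand t3.xlarge
spot = 0.0634        # USD per hour, Spot t3.xlarge

saving = 1 - spot / on_demand
print(f"{saving:.0%}")  # → 70%
```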
Application Modernisation:
Brighte had a good set of applications based on different technologies deployed across multiple AWS services. During this phase, the DNX team focused on the refactoring of the main applications to deploy the content via Docker containers and subsequently make use of ECS with spot instances.
They had previously adopted some of the 12-factor principles, but needed to improve their control over sensitive data and credentials. DNX proposed the use of AWS Systems Manager Parameter Store and adapted all the applications to follow this pattern.
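The pattern here is the 12-factor one: applications read configuration only from their environment, and the deployment (for example, the `secrets` section of an ECS task definition) injects values from Parameter Store at start-up. A minimal sketch, assuming a hypothetical `/<app>/<env>/<KEY>` naming convention:

```python
import os

def parameter_path(app, env, key):
    """Hypothetical Parameter Store naming convention for one secret."""
    return f"/{app}/{env}/{key}"

def get_config(key, default=None):
    """12-factor style: the app only reads its environment; the task
    definition maps each variable to a Parameter Store path, so credentials
    never live in the codebase or the container image."""
    value = os.environ.get(key, default)
    if value is None:
        raise KeyError(f"missing required config: {key}")
    return value

print(parameter_path("billing-api", "uat", "DB_PASSWORD"))  # /billing-api/uat/DB_PASSWORD
```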
A few serverless applications and static UI pages were deployed as part of this phase without requiring significant refactoring. We adapted the remaining apps to the 12-factor app methodology and made use of our CI/CD pipeline strategy.
The AWS environments (dev, UAT, production) were made identical, varying only in EC2 instance types. The same immutable application image was deployed and tested across them. By adopting this approach, Brighte improved its operational resilience, reducing production incidents to zero through its self-healing platform.
Logs:
Due to the high volume of logs, Brighte was using the ELK stack (Elasticsearch, Logstash, and Kibana) in legacy accounts to aggregate all of its application logs and avoid losing data during the process. The solution was working fine, but since it is not a fully managed solution, the operational overhead was significant.
DNX suggested replacing Logstash with Kinesis Data Firehose and CloudWatch Logs subscription filters to send the data directly to the Elasticsearch cluster. This way, Brighte eliminated the need for dedicated resources to manage the solution and took advantage of the automatic transfer of logs between the applications, CloudWatch, and Elasticsearch.
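The wiring is one subscription filter per application log group, pointing at the Firehose delivery stream. A sketch of the request payload (log group name, ARNs, and account ID are illustrative assumptions), which would be passed to boto3's `logs.put_subscription_filter`:

```python
def subscription_filter(log_group, firehose_arn, role_arn):
    """Kwargs for put_subscription_filter(): stream every log event from the
    application's log group into the Firehose delivery stream, which buffers
    and delivers the data into the Elasticsearch cluster."""
    return {
        "logGroupName": log_group,
        "filterName": "to-firehose",
        "filterPattern": "",  # empty pattern forwards all events
        "destinationArn": firehose_arn,
        "roleArn": role_arn,  # role CloudWatch Logs assumes to write to Firehose
    }

req = subscription_filter(
    "/ecs/billing-api",  # illustrative log group
    "arn:aws:firehose:ap-southeast-2:111111111111:deliverystream/logs-to-es",
    "arn:aws:iam::111111111111:role/cwl-to-firehose",
)
```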

CI/CD pipeline:
Brighte was using Bitbucket as the provider for its application pipelines. DNX adjusted the pipeline strategy, reducing the complexity of deployments across different environments, and included tools to automate the replacement of data used in automated tests using AWS Systems Manager Parameter Store. In addition, the Bitbucket pipelines were integrated with AWS using OpenID Connect (OIDC). As a result, there is no need to create AWS IAM users or manage AWS keys to access AWS resources. This strategy improved security and removed sensitive credentials from Brighte's codebase.
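Under OIDC, each pipeline step exchanges Bitbucket's short-lived identity token for temporary AWS credentials instead of using stored keys. A sketch of that exchange (the role ARN is an illustrative assumption; `BITBUCKET_STEP_OIDC_TOKEN` is the token Bitbucket exposes to the step), building the payload for STS `assume_role_with_web_identity`:

```python
import os

def oidc_credentials_request(role_arn, session_name="bitbucket-pipeline"):
    """Kwargs for sts.assume_role_with_web_identity(): exchange the pipeline's
    OIDC token for temporary credentials -- no IAM users, no long-lived keys."""
    return {
        "RoleArn": role_arn,
        "RoleSessionName": session_name,
        "WebIdentityToken": os.environ["BITBUCKET_STEP_OIDC_TOKEN"],
        "DurationSeconds": 3600,  # credentials expire after one hour
    }
```

The IAM role's trust policy restricts which Bitbucket workspace and repository may assume it, so a leaked pipeline configuration alone grants nothing.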


Databases:
The databases were already deployed in RDS prior to this project, but DNX increased security by encrypting all of the database workloads and improved redundancy by activating a Multi-AZ deployment during the database migration phase. The databases were also created in dedicated, isolated subnets that allow incoming traffic only from private subnets: the network ACLs restrict inbound traffic to specific private subnet CIDR ranges, and the RDS security groups allow inbound traffic only from the ECS instances.
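The database ingress rule in that layout references the ECS instances' security group rather than a CIDR range. A sketch of the permission entry (the group ID and port are illustrative assumptions) for `ec2.authorize_security_group_ingress`:

```python
def db_ingress_rule(ecs_sg_id, port=5432):
    """One IpPermissions entry: allow the database port only from members of
    the ECS instances' security group -- no CIDR-based access at all."""
    return {
        "IpProtocol": "tcp",
        "FromPort": port,
        "ToPort": port,
        "UserIdGroupPairs": [{"GroupId": ecs_sg_id}],  # source = ECS security group
    }

rule = db_ingress_rule("sg-0123abcd")  # illustrative security-group ID
```

Referencing a security group instead of an IP range means the rule keeps working as ECS instances are replaced by auto-scaling, since membership, not address, grants access.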

Conclusion
From conception to its conclusion, the project was completed in approximately five months, with the restructure of AWS accounts, infrastructure resources, and a total of 15 applications migrated to the new AWS environments.
The applications now perform consistently, scaling automatically with their clusters and without risk of downtime, thanks to the redundancy and self-healing strategies delivered by DNX products. The operational overhead of infrastructure and application deployment has been reduced significantly, which is reflected directly in Brighte's ability to release products more frequently.
With the new pattern adopted across all applications and the use of ECS clusters with Spot Instances, Brighte has achieved a cost reduction of 50-60%, an outstanding result for such a large set of applications and infrastructure resources used by its digital platform.
Finally, the secure foundation itself reduced operational costs: with security and best practices in place, complexity went down, so Brighte now spends less on operations and is able to move faster and more safely.
At DNX Brasil, we work to bring a better cloud and application experience to digital-native companies. We focus on AWS, Well-Architected solutions, containers, ECS, Kubernetes, continuous integration/continuous delivery, and service mesh. We are always looking for professionals experienced in cloud computing for our team, with a focus on cloud-native concepts. Check out our open-source projects at https://github.com/DNXLabs and follow us on Twitter, LinkedIn, or YouTube.