Skills

Here are some of the tools, languages, and platforms I frequently work with to deliver data-driven solutions.

Programming Languages & Databases

  • Python (focused on data analysis).
  • SQL for data extraction.
  • PostgreSQL, MySQL, Oracle, and SQLite databases.
  • Google Apps Script.
  • Web scraping with Beautiful Soup.
  • R for statistical modeling.
  • DAX (Data Analysis Expressions).

Spreadsheets, CRM, & Data Visualization Tools

  • HubSpot (Marketing, Sales, Service, and Operations).
  • Google Sheets and Looker Studio.
  • Microsoft Excel and Power BI.
  • Visualization libraries: Matplotlib, Seaborn, Plotly, and Folium.

Software Engineering

  • Version control with Git and GitHub.
  • Web app development with Streamlit.
  • Application hosting on Heroku, Streamlit Cloud, Google Cloud Platform (GCP), Amazon Web Services (AWS), and Microsoft Azure.

Statistics & Machine Learning

  • Descriptive and inferential statistics.
  • Regression, classification, and clustering algorithms.
  • Model performance metrics: RMSE, MAE, MAPE, Confusion Matrix, Precision, Recall, ROC Curve, AUC, Silhouette Score, F1-Score.
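As a quick illustration of a few of these classification metrics, here is a minimal pure-Python sketch (with hypothetical labels and predictions) that builds a 2×2 confusion matrix and derives Precision, Recall, and F1-Score from it:

```python
# Hypothetical binary-classification results (1 = positive class).
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

# Confusion-matrix cells: count each (true, predicted) combination.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

precision = tp / (tp + fp)  # share of predicted positives that are real
recall = tp / (tp + fn)     # share of real positives that were found
f1 = 2 * precision * recall / (precision + recall)

print(tp, fp, fn, tn)                 # → 4 1 1 4
print(precision, recall, round(f1, 2))  # → 0.8 0.8 0.8
```

In practice a library such as scikit-learn computes these directly, but writing them out makes the trade-off between Precision and Recall explicit.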

Professional and Academic Experiences

HubSpot CRM Analyst at Hook Digital

In my most recent role, I work as a CRM Analyst at Hook Digital, a HubSpot partner tech agency focused primarily on system integrations and migrations into HubSpot. I have worked both as an analyst and as a consultant across implementation and migration projects, audits, and advisory engagements: understanding client pain points, designing and building solutions, training teams, and performing data migrations.

I also take part in our internal team that supports international clients, working on projects conducted entirely in English, both written and spoken.

The main impacts I’ve delivered across the projects I’ve worked on include:

  • Improving CRM data quality and reliability.
  • Strengthening operational alignment between Marketing, Sales, and Customer Success by structuring clear, centralized processes.
  • Increasing team efficiency by implementing automations that replaced manual work.
  • Enabling better decision-making by delivering clear insights and dashboards.
  • Executing secure migrations with no data loss and providing data-quality dashboards to ensure operational continuity.

The clients I’ve supported span several industries, including healthcare, education, construction, metallurgy, banking and financial services, technology, retail, and others.

Business Analyst at Panificadora Sabores & Massas

I also work as an independent business analyst for a small bakery where I previously worked as a sales clerk for a little over three years. I’m responsible for collecting and analyzing data, generating reports, and supporting decision-making alongside the business owner.

Data Analyst Intern at Rehagro

I also worked as a Data Analyst Intern at Rehagro, an education company focused on agribusiness. I was assigned to the corporate sales department, where my main responsibilities included:

  • Creating and automating spreadsheets using Google Sheets and Apps Script.
  • Maintaining dashboards in Power BI.
  • Building, evaluating, and maintaining pipelines and dashboards in HubSpot, and cleaning HubSpot objects.
  • Supporting the Prospecting team with bulk lead generation using Python.
  • Assisting the Management and HR team with registering and updating roles, skills, and employee records in the Sólides platform.

Counter Clerk at Panificadora Sabores & Massas

Also at the bakery mentioned above, I mainly worked as a counter clerk and cashier. I was responsible for leading a counter team of seven, managing communication between the counter and kitchen teams, and overseeing the company’s management system. Under my leadership, we also introduced a delivery service and increased the company’s presence on social media.

Data Scientist (Apprentice) at Comunidade DS

For over two years, I took part in a learning community for data professionals, especially Data Scientists and Data Analysts. During this time, I completed several projects, joined hackday competitions, participated in live sessions and mentorships, and earned multiple certificates.

Bachelor’s Degree at Cruzeiro do Sul

I’m in my final semester of a Bachelor’s degree in Data Science at Cruzeiro do Sul, where I gained the opportunity to pursue an internship and fully enter the data field. During my studies, I was introduced to programming languages such as Python and R, Oracle databases, concepts of data structure, data mining and modeling, statistical inference, exploratory data analysis, Business Intelligence, operating systems, cloud computing, artificial intelligence, applied programming, information modeling and visualization, big data, machine learning, neural networks, predictive analysis, and classification.

Integrated Technical Program at IFRN

During my time as an Information Technology (IT) student at the Federal Institute of Education, Science and Technology of Rio Grande do Norte (IFRN), I gained hands-on experience with a wide range of tools and technologies relevant to the IT field. I was introduced to programming and scripting languages such as C, C++, C#, JavaScript, HTML, CSS, and SQL, as well as MySQL databases. I also learned about computer maintenance, network engineering, and electricity and electronics concepts.

At the end of the program, I developed (in a team of two) my final project, entitled “Do The Evolution!: A Gamification Proposal for Teaching the Theory of Evolution in Schools”, a virtual game designed as an educational tool to support teaching evolution and demonstrate the benefits of gamification in education. The project was implemented using Unity as the game engine, C# as the programming language, and Figma as the design tool.

Projects

Analysis and Classification of Properties for Buying and Selling

The project aims to answer two questions posed by the business team of a fictional real estate company called House Rocket Company: 1) Which properties should House Rocket purchase, and at what price? 2) Once purchased, what is the best time to sell these properties, and at what price? To answer these questions, a dataset from Kaggle was collected and analyzed.

Tools Used:

  • Python 3.8.0.
  • Jupyter Notebook.
  • PyCharm.
  • Streamlit and Streamlit Cloud.
  • Descriptive statistics.
  • Git and GitHub.
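To give a flavor of the buy-side logic, here is a minimal sketch of a descriptive-statistics purchase rule of the kind used in the project (the listings, zipcodes, and the condition threshold below are invented for illustration; the real project analyzes the Kaggle dataset): flag a property as a purchase candidate when its price is below the median price of its region and it is in good condition.

```python
from statistics import median

# Hypothetical listings: (id, zipcode, price, condition on a 1-5 scale).
listings = [
    (1, "98001", 250_000, 4),
    (2, "98001", 310_000, 3),
    (3, "98001", 280_000, 5),
    (4, "98002", 400_000, 2),
    (5, "98002", 350_000, 4),
    (6, "98002", 500_000, 5),
]

# Median price per zipcode: the reference value for each region.
by_zip = {}
for _, zc, price, _ in listings:
    by_zip.setdefault(zc, []).append(price)
medians = {zc: median(prices) for zc, prices in by_zip.items()}

# Buy rule: priced below the regional median AND condition >= 4.
to_buy = [pid for pid, zc, price, cond in listings
          if price < medians[zc] and cond >= 4]
print(to_buy)  # → [1, 5]
```

The same grouping-and-filtering step is what a pandas `groupby` would express more compactly on the full dataset.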

Market Research for Jeans

This project involved data collection from a website (web scraping) and subsequent analysis to answer questions posed by the owners of Star Jeans Company, a fictional company preparing to enter the e-commerce fashion retail sector. The questions were: 1) What is the optimal selling price for the jeans? 2) Which types of jeans and colors should be included in the initial product line? 3) What raw materials are needed to manufacture the jeans?

Tools Used:

  • Python 3.8.0.
  • Jupyter Notebook.
  • PyCharm.
  • ETL processes.
  • Beautiful Soup.
  • Cron jobs.
  • SQLite.
  • Streamlit and Streamlit Cloud.
  • Descriptive statistics.
  • Git and GitHub.
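The scraping step can be sketched as follows, assuming the `bs4` package is installed; for illustration it parses an inline HTML fragment (the product markup below is invented) rather than the live site:

```python
from bs4 import BeautifulSoup

# Invented HTML fragment standing in for a product-listing page.
html = """
<ul class="products">
  <li class="product"><span class="name">Slim Jeans</span>
      <span class="price">$29.99</span></li>
  <li class="product"><span class="name">Skinny Jeans</span>
      <span class="price">$39.99</span></li>
</ul>
"""

soup = BeautifulSoup(html, "html.parser")

# Extract a (name, price) pair from each product card.
products = [
    (item.select_one(".name").get_text(),
     float(item.select_one(".price").get_text().lstrip("$")))
    for item in soup.select("li.product")
]
print(products)  # → [('Slim Jeans', 29.99), ('Skinny Jeans', 39.99)]
```

In the actual pipeline, a step like this would fetch the pages over HTTP, run on a schedule via cron, and load the cleaned rows into SQLite as part of the ETL process.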

Data Engineering and Analysis at a Small Bakery

With the goal of 1) encouraging a small business to adopt technology in its daily operations and 2) helping the business gain better insights about itself through data, an initial data engineering setup was implemented at Panificadora Sabores & Massas. This setup allowed the company to start collecting data daily and generating its first analyses and reports on a regular basis, as well as providing examples of decisions made based on this data.

Tools Used:

  • Google Sheets.
  • Google Apps Script.
  • Google Looker Studio.
  • Git and GitHub.

Get in Touch