FLL Masterpiece APOC AI Innovation Project

2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

Team Cleverbots embarked on an exciting journey this year, advancing through both regional and national competitions. Their dedication and hard work paid off when they were invited to participate in the prestigious international competition for the Masterpiece season. As part of their innovation project, I had the privilege of helping them develop a web app. The app, powered by an AI chatbot, is designed to inspire people to explore new hobbies through personalized recommendations, making it easier for users to discover activities they might not have otherwise considered.

As the coach of Team Cleverbots, I work closely with a group of seven enthusiastic Year 7 and 8 girls. Their drive and passion for technology are remarkable, and together they developed an innovative application called "Fresh - Hobby Sharing AI App." During the four-day international event, I supported the team by delivering presentations, demonstrating the app's unique features, and helping them engage with participants from across the globe. The event was a melting pot of creativity, featuring robot competitions and team-building exercises, with 52 teams representing 26 different countries. It was a fantastic opportunity for the girls to showcase their skills and creativity on a global stage.

This international event not only served as a platform to demonstrate the power of AI in innovation but also sparked meaningful conversations among coaches, educators, and participants. We discussed the implications of AI in education, how it can shape the future, and the importance of ensuring that young students are empowered to create with these technologies. The event highlighted how technology can be a tool for both personal growth and global collaboration, and I'm incredibly proud to have been part of such an inspiring experience with Team Cleverbots.


FLL Masterpiece APOC (International)

2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

Hooray! We've Been Invited to the FLL Asia Pacific Open Championship! 🎉

I'm thrilled to share that Team Cleverbots was invited to the FIRST LEGO League (FLL) Asia Pacific Open Championship at Macquarie University! With 52 teams from 26 different countries, it's an incredible opportunity to compete, share knowledge, and, of course, have some serious fun with LEGO robots.

The event spans four amazing days, and it's been nothing short of spectacular. We've competed in intense challenges, presented our innovative projects, and celebrated every moment with singing, dancing, and chanting! The energy here is off the charts, and it's been such a joy to connect with teams from all over the world.

What makes this event even more special is the sense of community and collaboration. Teams from across the globe are not just competing, but sharing their strategies, learning from one another, and building friendships. It's a beautiful mix of cultures, technology, and creativity all coming together through the love of robotics.

Every minute of the championship has been a blast, from the thrilling competition rounds to the downtime where we bond over common interests and celebrate each other's achievements. Seeing people from so many countries come together over a shared passion for LEGO robots has been truly inspiring.


Whether it's working on our robot's strategy or cheering each other on during matches, this experience has been one for the books. Here's to the amazing memories, new friendships, and endless fun we've had at the FLL Asia Pacific Open Championship. We've enjoyed every second of it!

Azure OpenAI Hackathon Winner

2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

Winner, Winner, Chicken Dinner! 🎉

I just received an email from the DevPost team, and guess what? My project is a winner in the Microsoft Azure OpenAI Hackathon! I'm beyond excited to share that my AI app finished in the top four. What started as a quick experiment has turned into something truly rewarding!

It's incredible to think that this AI app, which I put together in a short time (a few long nights over the weekends), stood out among so many great projects. The voice interaction, speech output, and language translation features seem to have really struck a chord with the judges. I couldn't be happier with the outcome!


Now that the adrenaline of the win is starting to settle, I'm thinking about what's next. Winning is fantastic, but the real value lies in how we can take this project further and apply it in meaningful ways. One big question that's been on my mind: How can we use these AI tools to engage young students in coding and learning?

AI-powered apps, like the one I built, have the potential to transform education in meaningful ways. Imagine:

  • AI tutors that help students understand coding concepts step-by-step, making complex topics more approachable.
  • Interactive coding lessons where students can communicate with the bot using voice commands, making it more fun and less intimidating.
  • Multilingual support to help non-native English speakers learn to code in their preferred language, breaking down barriers to entry.
  • Instant feedback where students can ask the chatbot for coding help and get immediate answers, encouraging exploration and problem-solving.

There's so much potential to use AI to make learning more engaging, accessible, and fun. I'm excited to explore how I can turn this hackathon project into a real tool for education that sparks curiosity and creativity in young minds.

Onward to the next challenge! 🎉

Azure OpenAI Hackathon RAG Chatbot

2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

The closing day of the Microsoft Azure OpenAI Hackathon is fast approaching, and I've just wrapped up Phase 2 of my project. It's been an exciting journey, and I've managed to add some really cool features that have made the chatbot even more powerful.

Here's what I've built on top of the base RAG chatbot using more Azure AI Services:

  • Voice Interaction: The chatbot can now accept voice commands, making it more interactive and user-friendly, especially in hands-free scenarios. This adds a whole new level of accessibility to the project.
  • Speech Output: In addition to text-based responses, the chatbot can now speak back the generated Python code or explanations. This is particularly useful for quick feedback or demonstrations.
  • Language Translation: I integrated a translation feature, allowing the chatbot to understand and respond in multiple languages. Whether you're coding in English, Spanish, or any other supported language, the chatbot can assist, making it more versatile and globally accessible.

These enhancements have transformed the chatbot into a more intuitive AI assistant that doesn't just help with coding but also communicates naturally with the user. I'm really excited to see how it performs in the hackathon!
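For anyone curious how the voice features can be wired up, here's a minimal sketch using the Azure Speech SDK. It's a simplification of what's in the project: the key, region, and the ask_chatbot() helper are placeholders, and the translation step is left out for brevity.

```python
# Minimal voice loop sketch (assumes the azure-cognitiveservices-speech package,
# a Speech resource key/region, and a placeholder ask_chatbot() helper).
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(subscription="YOUR_SPEECH_KEY", region="australiaeast")

def ask_chatbot(question: str) -> str:
    # Placeholder for the RAG chatbot call described above.
    return f"Here is some help with: {question}"

# Voice Interaction: capture one utterance from the default microphone.
recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config)
result = recognizer.recognize_once()

if result.reason == speechsdk.ResultReason.RecognizedSpeech:
    reply = ask_chatbot(result.text)
    # Speech Output: read the chatbot's answer back to the user.
    synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config)
    synthesizer.speak_text_async(reply).get()
```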

With everything built and ready, it's time to submit the project and see what kind of feedback we get. The hackathon has been a fantastic learning experience so far, and I'm eager to hear what others think of these new features. Let's put the project to the test and see how it stacks up against the competition! Fingers crossed! 🤞


FLL Masterpiece APOC Ahead

One min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

Hooray! 🎉 Our girls' FIRST LEGO League team is heading to the Asia Pacific Open Championships (APOC, https://www.firstaustralia.org/asia-pacific-open-championships) this July!

It's been an incredible journey through last year's regional and national competitions. As their coding mentor (still learning the ropes myself), I rebuilt the FLL competition table at home for their extra Robot Game practice over the weekends.

🌏 Now it's time to go international!


Read more via LinkedIn Post

Build a LEGO Chatbot

2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

In my latest experiment with AI-powered robotics, I decided to take all the API help specs from the Spike Prime 3 user guide and load them into Cosmos DB as a knowledge base. The goal was to use this data to generate better, more targeted Python code for Spike Prime robots using a LangChain RAG integration.

Here's the plan I followed (a rough sketch of how the pieces fit together appears after the list):

  1. Extracting API Data: I pulled all the relevant API definitions and code snippets from the Spike Prime user guide. This included functions related to movement, turning, and sensor control.
  2. Cosmos DB as a Knowledge Base: I uploaded the extracted API documentation into Cosmos DB, setting it up as a searchable knowledge base. This would allow us to easily retrieve API definitions and related examples.
  3. LangChain Integration: The idea was to use LangChain's search capabilities to query Cosmos DB. The chatbot would then fetch relevant Python API definitions and example code snippets directly from the knowledge base.
  4. Generating Python Code: Once the relevant information was fetched, the chatbot would combine this with user input to generate Python code that could control the Spike Prime robot.
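Below is a rough sketch of how steps 2-4 fit together in LangChain. To keep it self-contained, I use an in-memory FAISS index as a stand-in for the Cosmos DB knowledge base, and the deployment names are assumptions rather than the exact ones from my setup.

```python
# RAG sketch (assumes langchain, langchain-openai, langchain-community, faiss-cpu,
# plus AZURE_OPENAI_ENDPOINT / AZURE_OPENAI_API_KEY set in the environment).
from langchain_openai import AzureChatOpenAI, AzureOpenAIEmbeddings
from langchain_community.vectorstores import FAISS
from langchain.chains import RetrievalQA

# Toy knowledge base: in the real project these documents come from the
# Spike Prime user guide extracts stored in Cosmos DB.
api_docs = [
    "Spike Prime drive commands are exposed through the motor_pair module.",
    "Sensor readings come from the dedicated sensor modules on the hub.",
]

embeddings = AzureOpenAIEmbeddings(azure_deployment="text-embedding-ada-002")
retriever = FAISS.from_texts(api_docs, embeddings).as_retriever()

llm = AzureChatOpenAI(azure_deployment="gpt-35-turbo", api_version="2024-02-01")
qa = RetrievalQA.from_chain_type(llm=llm, retriever=retriever)

print(qa.invoke({"query": "Write Python to drive the Spike Prime forward"})["result"])
```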

However, I ran into an unexpected issue: despite having all the Spike Prime API information in Cosmos DB, the chatbot still occasionally attempted to rely on external Python libraries from the internet instead of using the custom API definitions I had loaded. This led to code that wasn't compatible with the Spike Prime system, as those external libraries were not designed for LEGO robots.

This is a challenge I need to address. The chatbot is prioritizing internet-based solutions over the internal knowledge base, which defeats the purpose of having a custom-built, Spike Prime-specific AI assistant.

Next steps? I plan to tweak the LangChain logic to ensure the chatbot prioritizes fetching from Cosmos DB and only uses the API definitions we've provided. This will help ensure that the Python code generated is compatible with the Spike Prime robot. Stay tuned for updates on how I solve this!
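One likely fix is to make the prompt itself more restrictive. Here's a hypothetical prompt template along those lines (the wording is mine, not the final version):

```python
# Hypothetical prompt that constrains generation to the retrieved Spike Prime docs.
from langchain_core.prompts import ChatPromptTemplate

spike_prompt = ChatPromptTemplate.from_messages([
    ("system",
     "You write Python for LEGO Spike Prime robots. Use ONLY the API functions "
     "listed in the context below. Never import third-party libraries that are "
     "not available on the Spike Prime hub.\n\nContext:\n{context}"),
    ("human", "{question}"),
])
```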

Microsoft Azure OpenAI Hackathon

2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

Recently, I came across the Microsoft Azure OpenAI Hackathon and couldn't resist giving it a go! With the help of the detailed online instructions, I was able to follow along easily and dive into some really exciting AI technologies. The step-by-step tutorial provided a smooth learning experience, making the whole process engaging and informative.

During the hackathon, I gained valuable knowledge on several key technologies:

  • Cosmos DB: I learned how this scalable database can store and manage massive amounts of data, which is perfect for AI applications.
  • LangChain: This framework helped me understand how to integrate language models into real-world applications, particularly with the help of APIs and external data.
  • RAG (Retrieval-Augmented Generation): This was one of the most interesting parts! I built a chatbot that retrieves information from a knowledge base to give more accurate and relevant responses, blending both AI generation and data retrieval.

The experience got me thinking: how can I apply a similar approach to improve AI coding for LEGO Spike Prime robots using Python? Could we potentially use tools like Cosmos DB, LangChain, and Retrieval-Augmented Generation to enhance the way robots process commands and interact?

Imagine a chatbot-like interface for programming robots, where the AI retrieves the most efficient and specific commands from a knowledge base to guide the robot's movements and tasks. It opens up a world of possibilities! Could we fine-tune a model that combines language generation with real-time data, and use it to make robotics coding smarter and more intuitive?

The hackathon has definitely sparked some new ideas for how we can push the boundaries of AI-powered robotics programming. I'm eager to explore these possibilities further and see how we can take our LEGO robot AI coding to the next level!

Update: got my completion badge!


Azure Global Sydney 2024 Tech Talk

2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

I'm excited to share that I had the incredible opportunity to attend the Global Azure Sydney 2024 user group session at the Microsoft Reactor today! Even more thrilling, I got to present my experiences with AI and coding during the event. It was an amazing moment to share my journey, including how I've been exploring AI model fine-tuning and integrating it with robotics like Spike Prime. Feeling grateful for the chance to connect with so many passionate people in the field and exchange ideas!


Here's a quick overview of my talk and some key takeaways:

  • The Power of AI in Code Generation: I dived into how I've been using Azure's fine-tuning to create more efficient, domain-specific code for robotics.
  • From Generic Python to Robot-Specific Commands: I shared my experience building models that turn general Python code into Spike Prime-specific instructions.
  • Azure Endpoint Integration: I walked through how to set up and connect an AI model to a local environment, demonstrating real-time interactions with the Spike Prime robot.

The feedback I received was beyond amazing! Attendees from all over the world were interested in how AI can assist with robotics programming and coding efficiency. The response from the team at Microsoft was also incredibly positive, sparking some great conversations on future collaborations and projects.

I had a fantastic day at Microsoft Reactor in Sydney, a space filled with innovation and learning. The energy was contagious, and I left with so many new ideas and connections.


You can watch the full session here:

Watch My Azure Global Sydney 2024 Talk

Integrate GPT 3.5 model with Spike Prime 3

2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

We've made great progress in our journey to fine-tune an AI model for Spike Prime robots. With the training dataset and validation dataset ready, it's time to take the next step: setting up an Azure endpoint and integrating it with our local Python script.

First, we configured the Azure endpoint to allow us to send and receive data. This will enable our fine-tuned model to process instructions in real-time. By doing this, we can interact with the model directly from our local machine, making it easy to integrate AI-powered code suggestions into our Spike Prime project.

For the robot connection, we used a serial port interface. This allowed us to send commands from the Python script running on our local machine to the robot in real-time. This setup ensures that we can test and execute movements and commands on the fly.
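Roughly, the local loop looks like the sketch below. Treat it as an outline: the deployment name and serial port are placeholders, environment variables hold the endpoint and key, and the actual handshake with the Spike Prime hub is more involved than a plain write().

```python
# Sketch of the endpoint + serial setup (assumes openai>=1.x and pyserial,
# with AZURE_OPENAI_ENDPOINT / AZURE_OPENAI_API_KEY set in the environment).
import os
import serial
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

def generate_spike_code(instruction: str) -> str:
    """Ask the fine-tuned deployment for Spike Prime-specific Python."""
    response = client.chat.completions.create(
        model="spike-prime-ft",  # hypothetical fine-tuned deployment name
        messages=[
            {"role": "system", "content": "Generate Python for LEGO Spike Prime robots."},
            {"role": "user", "content": instruction},
        ],
    )
    return response.choices[0].message.content

# Send the generated code to the hub over the serial port (port name will vary).
code = generate_spike_code("Move forward two wheel rotations, then turn right 90 degrees.")
with serial.Serial("/dev/ttyACM0", 115200, timeout=2) as hub:
    hub.write(code.encode("utf-8"))
```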

The results look promising so far. The fine-tuned model has demonstrated an ability to stick closely to the instructions we provided. It seems to understand the specifics of the Spike Prime movement commands, offering more accurate code suggestions than a generic model.


Here's a glimpse of what we've tried so far:

  • Moving forward: The model was able to generate precise movement commands using Spike Prime-specific syntax.
  • Turning: We tested multiple turning scenarios, and the model successfully adjusted the turning angle based on the instructions.
  • Combining movements: The fine-tuned model was able to chain together commands, making the robot perform complex maneuvers smoothly.

The integration between Azure and our local environment is working well, and the fine-tuned model is proving to be a valuable tool in generating Spike Prime-specific Python code. As we continue to refine the model, I'm excited to see how far we can take it!

Use fine-tuned GPT 3.5 model to write Python

2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

Over the last 12 months, I've been amazed by all the developments happening in the AI world. From advancements in natural language processing to AI-driven coding tools, it's clear that AI is transforming the tech landscape faster than ever. With all the buzz, I decided to set a challenge for myself: to build an AI app from scratch.

One thing I've heard a lot about is Azure's fine-tuning capabilities. The idea behind fine-tuning is to take a generic AI model and specialize it for a particular task or domain. In my case, I want to see if I can fine-tune a model to improve the accuracy of Python coding snippets, specifically for Spike Prime robots.


Spike Prime is a versatile LEGO robotics kit that allows users to program movements and interactions. While generic Python code can be used to program the robots, I'm curious if fine-tuning a model on Azure can make the code more optimized for Spike Prime's specific needs.

This will be my first time building a fine-tuned model on Azure, so I want to make the process as smooth as possible. Instead of trying to fine-tune every Python function related to Spike Prime, I've decided to narrow the scope and focus only on a few scenarios, such as robot movement and turning.

The fine-tuning process for this project looks like this (a small sketch of steps 1 and 2 follows the list):

  1. Dataset preparation: Gather Python code snippets that are specifically tailored for Spike Prime, focusing on movement and turning functions.
  2. Fine-tuning: Use Azure's fine-tuning service to train the model with this specialized dataset.
  3. Testing: Evaluate the fine-tuned model by inputting generic Python code and observing how it converts the code to a more Spike Prime-specific version.
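To make steps 1 and 2 concrete, here's a small sketch of what one training example and the job submission might look like. The example pair, file name, and base model name are illustrative; check the current Azure OpenAI fine-tuning docs for the models available in your region.

```python
# Sketch of dataset prep + fine-tuning job (assumes openai>=1.x and an Azure OpenAI
# resource with fine-tuning enabled; endpoint/key are read from environment variables).
import json
from openai import AzureOpenAI

# Step 1: each training example pairs a generic request with the Spike Prime-specific
# answer, in the chat-format JSONL that gpt-3.5 fine-tuning expects.
example = {
    "messages": [
        {"role": "system", "content": "Convert generic Python into Spike Prime code."},
        {"role": "user", "content": "Drive the robot forward a short distance."},
        {"role": "assistant", "content": "# Spike Prime-specific movement code goes here"},
    ]
}
with open("spike_train.jsonl", "w") as f:
    f.write(json.dumps(example) + "\n")

# Step 2: upload the dataset and start the fine-tuning job.
client = AzureOpenAI(api_version="2024-02-01")
train_file = client.files.create(file=open("spike_train.jsonl", "rb"), purpose="fine-tune")
job = client.fine_tuning.jobs.create(training_file=train_file.id, model="gpt-35-turbo-0613")
print(job.id)
```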

This is just the beginning, and I'm excited to see what kind of improvements we can achieve. Stay tuned as I dive deeper into the world of AI fine-tuning and robotics!