
41 posts tagged with "azure"


Build Chatbot for Submerged Season for Coding

· 3 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

In the new Submerged season of FIRST LEGO League (FLL), we are continuing our journey of using AI to help us learn, innovate, and make robotics more accessible to everyone. Our team took the initiative to build a chatbot using Microsoft Azure (with lots of help from our coaches) and publish an app that anyone can use, free of charge. This was a big step toward creating a more inclusive, cost-effective way for young engineers and coders to get the most out of AI.

One of the challenges we faced when using tools like ChatGPT is that they often require users to create an account, and there are costs associated with using advanced AI features. This creates a barrier for many students and teams who might not have access to these resources. To solve this, we built a free version of our chatbot that specifically focuses on the needs of FLL participants. We locked down the scope of the questions and answers so that they only relate to the official season documents and guides for the Submerged challenge, as well as coding instructions for the LEGO SPIKE Prime 3.

The chatbot is designed to be simple and focused, giving users clear, relevant information about the FLL challenge and the Python coding required for SPIKE Prime 3 robots. Whether a team needs help understanding the mission details or figuring out how to make their robot move in a specific way, our chatbot has the answers.


One of the key skills we emphasized this season is prompt engineering: the art of asking the right questions to get the best answers from an AI. While many people assume that AI automatically gives perfect responses, the truth is that how you ask a question can dramatically affect the quality of the answer you receive. We've been teaching our team and users of the chatbot how to frame their prompts clearly and concisely to get more accurate and useful information.

For example, instead of asking, "How do I code my robot?" a better prompt might be, "Write Python code to move my SPIKE Prime 3 robot forward for 5 seconds and stop." By being specific, users can get actionable responses that directly help with their projects.
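This idea can even be sketched programmatically: a tiny helper that turns a vague request into a specific, actionable prompt. The function name and template below are mine, for illustration only:

```python
def build_prompt(action: str, duration_s: float) -> str:
    """Turn a vague request into a specific, actionable prompt."""
    return (
        f"Write Python code for a LEGO SPIKE Prime 3 robot that makes it "
        f"{action} for {duration_s} seconds and then stop. "
        f"Use only the official SPIKE Prime 3 Python API."
    )

print(build_prompt("move forward", 5))
```

The point is that every piece of context you add (the robot model, the duration, the API constraint) narrows the space of possible answers toward something actually runnable.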

This new chatbot, combined with prompt engineering techniques, empowers young engineers to not only rely on AI for quick solutions but also to better understand how to interact with technology in ways that enhance their learning experience. We're excited to see how teams use this tool during the Submerged season and how it helps them innovate and succeed. We hope to remove some of the barriers that traditionally come with using advanced AI tools and give every FLL team a chance to harness the power of AI for their robotic challenges.

FLL Submerged Season Kickoff Tech Talk

· 3 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

I had the exciting opportunity to present a talk at the FIRST LEGO League (FLL) Australia season kickoff in August 2024, an event where young minds gather to prepare for a year full of innovation, teamwork, and, of course, robotics. My talk was titled "ChatGPT, Write Python Code for My Robot, Please!", a topic that connected artificial intelligence, robotics, and education in a way that resonated with both the kids and mentors.

One of the most exciting things to prepare for this talk was building up a Python knowledge base specifically for the LEGO SPIKE Prime, the popular robotics kit used in FLL. SPIKE Prime uses a powerful hub and motors that can be programmed in Python. By leveraging ChatGPT's advanced language model and training it with SPIKE Prime's API and specifications, I was able to get it to generate Python code that effectively controls the robot's movements and sensors. This is a great option for kids (and even adults) who might struggle with the initial complexities of writing code from scratch.


For example, using simple natural language prompts like "write Python code to move the robot forward for 20 cm", ChatGPT can now produce highly accurate, ready-to-run Python scripts that control the motors of a LEGO robot.
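Behind a prompt like that sits a small piece of arithmetic the generated code has to get right: converting driving distance into wheel rotation. A sketch, assuming the standard 5.6 cm SPIKE Prime wheel (measure your own wheels before trusting the number):

```python
import math

WHEEL_DIAMETER_CM = 5.6  # standard SPIKE Prime wheel; check yours

def cm_to_degrees(distance_cm: float) -> int:
    """Convert a driving distance into motor degrees for one wheel."""
    circumference = math.pi * WHEEL_DIAMETER_CM
    return round(distance_cm / circumference * 360)

print(cm_to_degrees(20))  # roughly 409 degrees for 20 cm
```

When ChatGPT produces a script for "move forward 20 cm", it is this conversion (plus the motor API calls) that it needs to get right for the robot to stop where you expect.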

During the presentation, I walked the audience through several real-world examples of how ChatGPT can assist in generating Python code for SPIKE Prime robots. I showed how, with minimal guidance, the AI could create scripts that made the robot move forward, turn, stop, and even detect objects using sensors. The kids were thrilled to see how a few simple prompts resulted in real-time robot movement, and their imaginations lit up at the possibilities of using AI to enhance their learning and problem-solving.

One key aspect of my talk was addressing the responsible use of AI as a learning tool. I emphasized how tools like ChatGPT should not be seen as a shortcut or replacement for understanding core concepts but rather as a tool to enhance learning. I encouraged the kids to experiment with the AI, ask it questions, and use it to break down complex problems, but to also remain curious about how the code works. This way, they could continue learning and developing their own skills while benefiting from the AI's assistance.

To make the talk even more engaging, I introduced two major figures in the tech world: Satya Nadella, CEO of Microsoft, and Sam Altman, CEO of OpenAI. I explained their roles in the development of AI and how their vision has contributed to making these cutting-edge technologies more accessible to everyone. The kids were fascinated to learn about the real people behind the tools they were interacting with and how AI is evolving to shape the future of not just robotics, but many industries.

The talk was super fun! The combination of showing how AI can be a practical, hands-on tool for robotics, while also discussing the broader implications of responsible AI use, made it a well-rounded experience. Watching the kids' excitement as they saw the robots come to life through AI-generated code was a highlight, and I hope they left feeling empowered to continue exploring the amazing world of robotics and AI.

Azure OpenAI Hackathon Winner

· 2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

Winner, Winner, Chicken Dinner! 🎉

I just received an email from the DevPost team, and guess what? My project is a winner in the Microsoft Azure OpenAI Hackathon! I'm beyond excited to share that my AI app made it to the top 4 places. What started as a quick experiment has turned into something truly rewarding!

It's incredible to think that this AI app, which I put together in a short time (a few long nights on the weekends), stood out among so many great projects. The voice interaction, speech output, and language translation features seem to have really struck a chord with the judges. I couldn't be happier with the outcome!


Now that the adrenaline of the win is starting to settle, I'm thinking about what's next. Winning is fantastic, but the real value lies in how we can take this project further and apply it in meaningful ways. One big question that's been on my mind: How can we use these AI tools to engage young students in coding and learning?

AI-powered apps, like the one I built, have the potential to transform education in meaningful ways. Imagine:

  • AI tutors that help students understand coding concepts step-by-step, making complex topics more approachable.
  • Interactive coding lessons where students can communicate with the bot using voice commands, making it more fun and less intimidating.
  • Multilingual support to help non-native English speakers learn to code in their preferred language, breaking down barriers to entry.
  • Instant feedback where students can ask the chatbot for coding help and get immediate answers, encouraging exploration and problem-solving.

There's so much potential to use AI to make learning more engaging, accessible, and fun. I'm excited to explore how I can turn this hackathon project into a real tool for education that sparks curiosity and creativity in young minds.

Onward to the next challenge! 🎉

Azure OpenAI Hackathon RAG Chatbot

· 2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

The closing day of the Microsoft Azure OpenAI Hackathon is fast approaching, and I've just wrapped up Phase 2 of my project. It's been an exciting journey, and I've managed to add some really cool features that have made the chatbot even more powerful.

Here's what I've built on top of the base RAG chatbot using more Azure AI Services:

  • Voice Interaction: The chatbot can now accept voice commands, making it more interactive and user-friendly, especially in hands-free scenarios. This adds a whole new level of accessibility to the project.
  • Speech Output: In addition to text-based responses, the chatbot can now speak back the generated Python code or explanations. This is particularly useful for quick feedback or demonstrations.
  • Language Translation: I integrated a translation feature, allowing the chatbot to understand and respond in multiple languages. Whether you're coding in English, Spanish, or any other supported language, the chatbot can assist, making it more versatile and globally accessible.
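As an illustration of the translation piece, here is how a request to Azure Translator's v3 REST endpoint can be assembled. Only the URL and body construction are shown; a real call also sends the Ocp-Apim-Subscription-Key and region headers on an HTTPS POST:

```python
import json
from urllib.parse import urlencode

TRANSLATOR_ENDPOINT = "https://api.cognitive.microsofttranslator.com/translate"

def build_translate_request(text: str, to_lang: str) -> tuple[str, bytes]:
    """Assemble the URL and JSON body for an Azure Translator v3 call."""
    url = f"{TRANSLATOR_ENDPOINT}?{urlencode({'api-version': '3.0', 'to': to_lang})}"
    body = json.dumps([{"Text": text}]).encode("utf-8")
    return url, body

url, body = build_translate_request("Move the robot forward", "es")
print(url)
```

The same pattern (configure, build request, send, parse) applies to the speech-to-text and text-to-speech services as well.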

These enhancements have transformed the chatbot into a more intuitive AI assistant that doesn't just help with coding but also communicates naturally with the user. I'm really excited to see how it performs in the hackathon!

With everything built and ready, it's time to submit the project and see what kind of feedback we get. The hackathon has been a fantastic learning experience so far, and I'm eager to hear what others think of these new features. Let's put the project to the test and see how it stacks up against the competition! Fingers crossed! 🤞


Build a LEGO Chatbot

· 2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

In my latest experiment with AI-powered robotics, I decided to take all the API help specs from the Spike Prime 3 user guide and load them into Cosmos DB as a knowledge base. The goal was to use this data to generate better, more targeted Python code for Spike Prime robots using a LangChain RAG integration.

Here's the plan I followed:

  1. Extracting API Data: I pulled all the relevant API definitions and code snippets from the Spike Prime user guide. This included functions related to movement, turning, and sensor control.
  2. Cosmos DB as a Knowledge Base: I uploaded the extracted API documentation into Cosmos DB, setting it up as a searchable knowledge base. This would allow us to easily retrieve API definitions and related examples.
  3. LangChain Integration: The idea was to use LangChain's search capabilities to query Cosmos DB. The chatbot would then fetch relevant Python API definitions and example code snippets directly from the knowledge base.
  4. Generating Python Code: Once the relevant information was fetched, the chatbot would combine this with user input to generate Python code that could control the Spike Prime robot.
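Steps 3 and 4 can be sketched without any cloud services: score knowledge-base entries against the query, then splice the best match into the prompt. This stands in for what LangChain and Cosmos DB do at scale; the API entries below are made up for illustration:

```python
KNOWLEDGE_BASE = [
    {"api": "motor.run", "doc": "Run a motor at a given velocity until stopped."},
    {"api": "motor.stop", "doc": "Stop a running motor."},
    {"api": "distance_sensor.distance", "doc": "Read the distance sensor in mm."},
]

def retrieve(query: str, top_k: int = 1) -> list[dict]:
    """Rank knowledge-base entries by naive keyword overlap with the query."""
    words = set(query.lower().split())
    scored = sorted(
        KNOWLEDGE_BASE,
        key=lambda e: len(words & set(e["doc"].lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_rag_prompt(query: str) -> str:
    """Combine the retrieved API docs with the user question."""
    context = "\n".join(f"{e['api']}: {e['doc']}" for e in retrieve(query))
    return f"Using only this API:\n{context}\n\nQuestion: {query}"

print(build_rag_prompt("How do I stop a running motor?"))
```

A production setup replaces the keyword overlap with vector similarity search, but the shape of the pipeline (retrieve, then generate against the retrieved context) is the same.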

However, I ran into an unexpected issue: despite having all the Spike Prime API information in Cosmos DB, the chatbot still occasionally attempted to rely on external Python libraries from the internet instead of using the custom API definitions I had loaded. This led to code that wasn't compatible with the Spike Prime system, as those external libraries were not designed for LEGO robots.

This is a challenge I need to address. The chatbot is prioritizing internet-based solutions over the internal knowledge base, which defeats the purpose of having a custom-built, Spike Prime-specific AI assistant.

Next steps? I plan to tweak the LangChain logic to ensure the chatbot prioritizes fetching from Cosmos DB and only uses the API definitions we've provided. This will help ensure that the Python code generated is compatible with the Spike Prime robot. Stay tuned for updates on how I solve this!
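One complementary safeguard, beyond tweaking the retrieval logic, is a guard that rejects any generated snippet importing libraries outside the SPIKE API before it ever reaches the robot. A minimal sketch; the allowed-module set below is my assumption, not an official list:

```python
import ast

# Modules I'd expect on the SPIKE hub; adjust to match your firmware.
ALLOWED_MODULES = {"hub", "motor", "motor_pair", "runloop", "time"}

def uses_only_allowed_imports(code: str) -> bool:
    """Reject generated code that pulls in libraries outside the SPIKE API."""
    tree = ast.parse(code)
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            names = [alias.name.split(".")[0] for alias in node.names]
        elif isinstance(node, ast.ImportFrom):
            names = [(node.module or "").split(".")[0]]
        else:
            continue
        if any(n not in ALLOWED_MODULES for n in names):
            return False
    return True

print(uses_only_allowed_imports("import requests"))       # not a SPIKE module
print(uses_only_allowed_imports("from hub import port"))  # fine
```

When the guard fires, the chatbot can re-prompt the model with the retrieved API definitions instead of shipping incompatible code to the user.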

Microsoft Azure OpenAI Hackathon

· 2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

Recently, I came across the Microsoft Azure OpenAI Hackathon and couldn't resist giving it a go! With the help of the detailed online instructions, I was able to follow along easily and dive into some really exciting AI technologies. The step-by-step tutorial provided a smooth learning experience, making the whole process engaging and informative.

During the hackathon, I gained valuable knowledge on several key technologies:

  • Cosmos DB: I learned how this scalable database can store and manage massive amounts of data, which is perfect for AI applications.
  • LangChain: This framework helped me understand how to integrate language models into real-world applications, particularly with the help of APIs and external data.
  • RAG (Retrieval-Augmented Generation): This was one of the most interesting parts! I built a chatbot that retrieves information from a knowledge base to give more accurate and relevant responses, blending both AI generation and data retrieval.

The experience got me thinking: how can I apply a similar approach to improve AI coding for LEGO Spike Prime robots using Python? Could we potentially use tools like Cosmos DB, LangChain, and Retrieval-Augmented Generation to enhance the way robots process commands and interact?

Imagine a chatbot-like interface for programming robots, where the AI retrieves the most efficient and specific commands from a knowledge base to guide the robot's movements and tasks. It opens up a world of possibilities! Could we fine-tune a model that combines language generation with real-time data, and use it to make robotics coding smarter and more intuitive?

The hackathon has definitely sparked some new ideas for how we can push the boundaries of AI-powered robotics programming. I'm eager to explore these possibilities further and see how we can take our LEGO robot AI coding to the next level!

Update: got my completion badge!


Azure Global Sydney 2024 Tech Talk

· 2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

I'm excited to share that today I had the incredible opportunity to attend the Global Azure Sydney 2024 user group session at the Microsoft Reactor! Even more thrilling, I got to present my experiences with AI and coding during the event. It was an amazing moment to share my journey, including how I've been exploring AI model fine-tuning and integrating it with robotics like Spike Prime. Feeling grateful for the chance to connect with so many passionate people in the field and exchange ideas!


Here's a quick overview of my talk and some key takeaways:

  • The Power of AI in Code Generation: I dove into how I've been using Azure's fine-tuning to create more efficient, domain-specific code for robotics.
  • From Generic Python to Robot-Specific Commands: I shared my experience building models that turn general Python code into Spike Prime-specific instructions.
  • Azure Endpoint Integration: I walked through how to set up and connect an AI model to a local environment, demonstrating real-time interactions with the Spike Prime robot.

The feedback I received was beyond amazing! Attendees from all over the world were interested in how AI can assist with robotics programming and coding efficiency. The response from the team at Microsoft was also incredibly positive, sparking some great conversations on future collaborations and projects.

I had a fantastic day at Microsoft Reactor in Sydney, a space filled with innovation and learning. The energy was contagious, and I left with so many new ideas and connections.


You can watch the full session here:

Watch My Azure Global Sydney 2024 Talk

Integrate GPT 3.5 model with Spike Prime 3

· 2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

We've made great progress in our journey to fine-tune an AI model for Spike Prime robots. With the training dataset and validation dataset ready, it's time to take the next step: setting up an Azure endpoint and integrating it with our local Python script.

First, we configured the Azure endpoint to allow us to send and receive data. This will enable our fine-tuned model to process instructions in real-time. By doing this, we can interact with the model directly from our local machine, making it easy to integrate AI-powered code suggestions into our Spike Prime project.
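Under the hood, talking to the endpoint is a plain HTTPS call. A sketch of the request assembly, assuming the standard Azure OpenAI REST shape; the resource and deployment names are hypothetical, and a real call also sends the api-key header:

```python
import json

def build_chat_request(endpoint: str, deployment: str, user_prompt: str,
                       api_version: str = "2024-02-01") -> tuple[str, bytes]:
    """Assemble the URL and JSON body for an Azure OpenAI chat completion call."""
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={api_version}")
    payload = {"messages": [{"role": "user", "content": user_prompt}]}
    return url, json.dumps(payload).encode("utf-8")

url, body = build_chat_request(
    "https://my-resource.openai.azure.com",  # hypothetical resource name
    "spike-finetune",                        # hypothetical deployment name
    "Move the robot forward for 20 cm",
)
print(url)
```

With this in place, the local Python script only needs to POST the body to the URL and read the model's reply out of the JSON response.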

For the robot connection, we used a serial port interface. This allowed us to send commands from the Python script running on our local machine to the robot in real-time. This setup ensures that we can test and execute movements and commands on the fly.
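A sketch of the serial side, assuming the hub accepts code over MicroPython's raw REPL (Ctrl-A to enter, Ctrl-D to execute); the port name varies by machine, and actually sending requires pyserial and a connected hub:

```python
RAW_REPL_ENTER = b"\x01"    # Ctrl-A: enter MicroPython raw REPL
RAW_REPL_EXECUTE = b"\x04"  # Ctrl-D: run the buffered code

def frame_command(code: str) -> bytes:
    """Wrap a Python snippet so it can be pasted into the raw REPL."""
    return RAW_REPL_ENTER + code.encode("utf-8") + RAW_REPL_EXECUTE

payload = frame_command("import motor\n")
print(payload)

# Sending it (requires pyserial and a connected hub):
# import serial
# with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as ser:  # port varies
#     ser.write(payload)
```

Framing the command as bytes like this is what lets the local script push AI-generated snippets to the robot without any manual copy-and-paste step.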

The results look promising so far. The fine-tuned model has demonstrated an ability to stick closely to the instructions we provided. It seems to understand the specifics of the Spike Prime movement commands, offering more accurate code suggestions than a generic model.


Here's a glimpse of what we've tried so far:

  • Moving forward: The model was able to generate precise movement commands using Spike Prime-specific syntax.
  • Turning: We tested multiple turning scenarios, and the model successfully adjusted the turning angle based on the instructions.
  • Combining movements: The fine-tuned model was able to chain together commands, making the robot perform complex maneuvers smoothly.

The integration between Azure and our local environment is working well, and the fine-tuned model is proving to be a valuable tool in generating Spike Prime-specific Python code. As we continue to refine the model, I'm excited to see how far we can take it!

Use fine-tuned GPT-3.5 model to write Python

· 2 min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

Over the last 12 months, I've been amazed by all the developments happening in the AI world. From advancements in natural language processing to AI-driven coding tools, it's clear that AI is transforming the tech landscape faster than ever. With all the buzz, I decided to set a challenge for myself: to build an AI app from scratch.

One thing I've heard a lot about is Azure's fine-tuning capabilities. The idea behind fine-tuning is to take a generic AI model and specialize it for a particular task or domain. In my case, I want to see if I can fine-tune a model to improve the accuracy of Python coding snippets, specifically for Spike Prime robots.


Spike Prime is a versatile LEGO robotics kit that allows users to program movements and interactions. While generic Python code can be used to program the robots, I'm curious if fine-tuning a model on Azure can make the code more optimized for Spike Prime's specific needs.

This will be my first time building a fine-tuned model on Azure, so I want to make the process as smooth as possible. Instead of trying to fine-tune every Python function related to Spike Prime, I've decided to narrow the scope and focus only on a few scenarios, such as robot movement and turning.

The fine-tuning process for this project looks like this:

  1. Dataset preparation: Gather Python code snippets that are specifically tailored for Spike Prime, focusing on movement and turning functions.
  2. Fine-tuning: Use Azure's fine-tuning service to train the model with this specialized dataset.
  3. Testing: Evaluate the fine-tuned model by inputting generic Python code and observing how it converts the code to a more Spike Prime-specific version.
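Step 1 above can be sketched as a small script that writes examples in the chat-style JSONL format Azure OpenAI's fine-tuning service expects. The system message and the example pairs here are placeholders, not our real dataset:

```python
import json

def to_training_line(prompt: str, completion: str) -> str:
    """One JSONL line in the chat format Azure OpenAI fine-tuning expects."""
    return json.dumps({
        "messages": [
            {"role": "system", "content": "You write Python for SPIKE Prime robots."},
            {"role": "user", "content": prompt},
            {"role": "assistant", "content": completion},
        ]
    })

examples = [
    ("Move forward for 20 cm", "# SPIKE-specific movement code goes here"),
    ("Turn right 90 degrees", "# SPIKE-specific turning code goes here"),
]
for prompt, completion in examples:
    print(to_training_line(prompt, completion))
```

Each printed line becomes one row of the training (or validation) JSONL file uploaded to the fine-tuning service.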

This is just the beginning, and I'm excited to see what kind of improvements we can achieve. Stay tuned as I dive deeper into the world of AI fine-tuning and robotics!

(Notice) Azure Global Sydney 2024 Tech Talk

· One min read
Daniel Fang
AI, Robotics & LEGO Enthusiast

Join me for an exciting session where we'll have some fun with the LEGO SPIKE Prime robot and Azure OpenAI. I'm proud to be presenting at Global Azure Sydney 2024 - Microsoft Reactor Sydney @ 20/04/2024.

📌 Session: Fine-tune GPT-3.5 model to control Lego Robot using Azure OpenAI & Python
🗣️ Speaker: Daniel Fang

Session details: https://sessionize.com/view/rjfzv8k0/GridSmart?format=Embed_Styled_Html&isDark=False&title=Global%20Azure%20Sydney%202024
Reserve your spot: https://www.tickettailor.com/events/azuresydneyusergroup/1193113


Read more via LinkedIn Post