Robot Industry Blog

ARCx Community Introduction

About Synthiam

Synthiam Inc. is a pioneering force in automation, artificial intelligence (AI), and technology business integration, dedicated to making advanced technologies accessible to a broad audience. The company's mission is to democratize AI and robotics, enabling businesses of all sizes to enhance efficiency and unlock new potential.

Synthiam simplifies the integration of AI and robotics into business operations, catering to users without a deep technical background in robotics. This approach has made Synthiam a hub for innovation, supported by a vibrant community of developers and technology enthusiasts.

The core of Synthiam's offerings is a suite of software solutions that facilitate intuitive human-machine interactions, allowing for the development of customized robotics applications tailored to streamline operations, reduce costs, and improve service delivery. This commitment to providing cutting-edge technology is supported by ongoing research and development, ensuring clients access to the most advanced tools in AI and robotics.

Beyond software, Synthiam offers comprehensive consulting services to guide businesses through identifying and implementing automation and AI solutions. From conceptualization to execution, Synthiam provides end-to-end support to ensure successful technology integration.

Synthiam also prioritizes education and community engagement, offering resources and workshops to foster a culture of learning and innovation. This commitment extends its impact beyond corporate transformation to empower individuals to contribute to the technological future.




Synthiam is creating a new version of ARC called ARCx

The new version is cross-platform, running on Linux (Ubuntu) for x86 or ARM64, Windows x64, and macOS ARM64. The new Synthiam ARCx aims to provide a robot development environment that accommodates users from education, DIY, and small/medium businesses. The x in the name ARCx represents the versatility of the platform, which targets not only multiple user groups but also many technologies.

The current ARC has many limitations when it comes to scaling a robot from an idea into a product. It requires the GUI to be present at all times, which means there are no background services to run on a headless robot. It also requires Windows, which excludes low-cost, low-power devices such as the Raspberry Pi.

The new ARCx resolves these limitations by providing a customizable user interface built with Razor, a web GUI development front end for creators and programmers. It can also run on a variety of single-board computers.

As the original Synthiam ARC evolved from EZ-Builder, we witnessed the maturing of several technologies that were premature back in the day. Today, many of these technologies have proven themselves and become standard practice in DIY, education, and enterprise. Specifically, they include Python, JavaScript, ARM64 processors, JSON, interactive server-side web GUIs like Blazor, standardized Linux distributions like Ubuntu, and multiple Arduino-compatible microcontrollers. Because these technologies have proven stable and are industry standards, Synthiam has confidence developing around them.

Today is the perfect convergence of standardized technologies and consumer knowledge of those technologies. This is why we picked now to begin developing the next generation of the world's most accessible robot programming platform, ARCx.

Over the last 14 years since the first version of ARC's predecessor, EZ-Builder, there have been millions of robot connections to the platform. Throughout that time, hundreds of thousands of comments and conversations have shared user experiences through questions and feedback on the website forum. While the Synthiam team keeps a close eye on user activity to continue developing new features, we also have a secret weapon that has helped us produce the new features that ARCx will provide.

Synthiam's secret weapon is Athena, our purpose-built AI that many know as our customer support agent. While Athena's knowledge base continues to grow, she is not limited to helping customers build robots. Athena is also used internally to help design features, architect programming APIs of the ARCx robot platform, and more.

When developing ARCx, we fed Athena's artificial intelligence algorithm the history of conversations on the forum from users like you. This has allowed us to architect features you have discussed or shown interest in.






Cross Platform

Linux and macOS have always played catch-up to the features of Microsoft Windows, which made supporting those operating systems taxing on the company's developer resources. In most cases, custom solutions must be created for each operating system, and that is not easy to maintain with Synthiam ARC's vast feature list. While other operating systems mature, Synthiam must constantly monitor and update the ARC platform for compatibility.

Today, we believe there is enough cross-platform support across Windows, Linux x64, Linux ARM64, and macOS. During ARCx development, our primary concerns were being able to easily support USB cameras, vision processing, and audio input/output. It took significant research and testing to find libraries that allowed a standard cross-platform API. Specifically, one of the difficulties was enumerating USB video devices across platforms so that robot skill developers can access them easily, independent of the operating system.

Because ARCx is designed to provide features in the form of robot skills, we needed the robot skills to be cross-platform. If someone built a robot skill for ARCx, it should be able to work on any supported operating system.

We know that some robot skills will use features that might not be supported across all operating systems, so a robot skill might only be compatible with one operating system. For example, suppose a robot skill uses a Linux feature unavailable in Windows or macOS. In that case, the robot skill will not be available to those running other operating systems.

When creating a robot skill, the author can define what operating systems are supported.






Vision Processing

Synthiam has historically used AForge as a video processing library, which had some great features, such as blobs and filters. The AForge library and Windows have always stored images in a Bitmap object, managed by an API embedded deep within the Windows OS. In the latest releases of .NET Core, Microsoft has made the Bitmap object (System.Drawing.Common) Windows-only, deprecating it for cross-platform use (https://learn.microsoft.com/en-us/dotnet/core/compatibility/core-libraries/6.0/system-drawing-common-windows-only).

This put us in a bit of a bind as we struggled to find a library that was easy to use and had licensing to let us distribute the library for others to program with. We have watched OpenCV grow throughout the years, and after review, we have determined its maturity has made it easy for robot skill developers to use. Synthiam has always been focused on making development easy so people can be creative. This applies to robot skill developers as well.

We have also been working on exposing OpenCV native functions to the ARCx JavaScript and Python runtimes. This means you will have access to process video directly from scripts rather than build custom robot skills.
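As a rough illustration of what that could look like, the sketch below grabs a camera frame and asks an OpenCV-backed helper for blobs. The function names (Camera.getFrame, CV.findBlobs) are placeholders for whatever the final ARCx scripting API exposes, not confirmed calls:

[code]

// Hypothetical sketch only: Camera.getFrame and CV.findBlobs are placeholder
// names used to illustrate calling OpenCV-backed helpers from an ARCx script.
var frame = Camera.getFrame("Camera");

// Ask the OpenCV wrapper for red blobs in the current frame.
var blobs = CV.findBlobs(frame, "red");

print("Detected " + blobs.length + " red blobs");

[/code]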






Web Based

Synthiam ARC has always had a graphical user interface that runs natively as an application. So, we have been watching many cross-platform GUI solutions, such as Xamarin, Avalonia, .NET MAUI, GTK, and others. But throughout our tests over the years, something didn't sit right with us.

It wasn't just that the libraries lacked a mature framework for us to trust; the whole idea of ARCx as an application didn't sit right. When ARC was closed, it stopped working. If the UI had issues, ARC might crash as well.

This is when the release of .NET 8 and Blazor inspired a prototype that resulted in the framework for ARCx. We had to push the limits of Blazor by giving robot skills separate workers and UIs. We wanted a robot skill to continue running even if the UI was closed or crashed. For the robot's mission-critical tasks, the core processing/worker needed to be isolated from the user interface.

We designed a model that allows each robot skill to have a worker, a UI, and a configuration screen. By separating these functions, there are no dependencies to crash the worker from the UI.

You can close your web browser, and the robot continues running in the background. This also means the robot does not need a monitor or screen, as the robot is programmed using a web browser.






Remote Programming From Any Device

Continuing from the previous feature of being web-based, this allows remote access from anywhere in the world with a web browser. You no longer need to use VNC or a remote desktop to connect to the robot for programming or remote control.

Using a web browser from your tablet, mobile phone, PC, Television, or game console, you have access to the ARCx interface.

The web browser renders the ARCx user controls using HTML, which means the interface scales for any screen resolution. You can increase the zoom ratio in the web browser if you prefer larger objects due to poor eyesight or smaller displays. The same applies to decreasing zoom to fit more on the display.

When ARCx loads, it determines your web browser's theme between dark or light mode. This feature can also be overridden in the ARCx options menu. This allows ARCx to seamlessly fit within your environment to continue supporting the creative process.






Custom User Interfaces

ARC has provided a custom user interface designer that allows you to create rudimentary interfaces from a small selection of graphic components. Interfaces were limited to the ARC components and displayed within the ARC application.

ARCx dramatically improves on this approach by providing custom user interfaces in HTML. This means you can easily create user interfaces with the components provided by ARCx and the MudBlazor UI library. You will essentially be designing your own interface, which will appear as an app for any user who controls your robot. This hides the programming interface and limits users to the remote control interfaces.






Encrypt Robot Projects

Save your robot projects with an encryption key to protect the program. ARCx uses 2048-bit RSA encryption to protect your project from anyone else accessing it. You can now safely and securely save your project to the cloud or accidentally leave it on a USB stick without anyone being able to reverse-engineer your effort.






New Control Command

The ControlCommand feature of ARC allows robot skills to send commands to other robot skills. For example, a speech recognition robot skill can send a command to the camera robot skill to begin tracking the color red when someone speaks a phrase.

ARCx extends the ControlCommand by allowing it to return values. This means a ControlCommand can query values from other robot skills rather than relying on the data being published as global variables. For example:

[code]
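// Query the Camera robot skill's "IsActive" control command and store the value it returns.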

var CameraStatus = ControlCommand("Camera", "IsActive");

print("The camera status is: " + CameraStatus);

[/code]
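For comparison, the send-only style from the speech-to-camera example above might look like the sketch below. The command name "StartColorTracking" is an illustrative placeholder; check the actual control commands your camera robot skill exposes:

[code]

// Fire-and-forget: tell the camera robot skill to begin tracking a color.
// "StartColorTracking" is a placeholder command name for illustration only.
ControlCommand("Camera", "StartColorTracking", "red");

[/code]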

The control commands are displayed in a tab beside the editor when editing the script. A new feature provides hover help for every control command. This means you can quickly hover to see what the control commands do without viewing the online manual for the robot skill.






Athena Built-In

We all know and love Athena as Synthiam's support agent on the Community Forum. We have integrated Athena into ARCx, which allows you to tap into her knowledge to assist with programming your robot. We have created templates that help guide your question so she can best understand how to assist. The templates are organized to help you with scripting, finding robot skills, mechanical design, electronics, sensor integration, and more.






Ports, Ports, and more Ports!

ARC was limited to 24 digital ports, 8 ADC ports, and 24 servo ports. While these limits are fine for most robot controllers and microcontrollers, we know how hardware changes. Internally, ARCx has removed the limit for the number of ports per EZB index, even though all microcontrollers have a limit. For the user interface, we added 100 of each type for simplicity.

In addition to increasing the number of available ports, we added the ability to add descriptions to ports for each EZB so you can easily see where they are being used. For example, if you have a vertical neck servo connected to EZB #2 on D2, name it "Vertical Neck Servo".

You can also name EZBs by adding descriptions so you can track where each one is located on the robot when multiple EZBs are being used.






Startup Script

An optional startup script can be selected to run when an ARCx robot project is loaded. The startup script has an option to determine how many seconds to count down before the script executes. For every second, the user interface displays a countdown popup so you can manually cancel the script before it executes.

You can use startup scripts to automatically connect to a robot and begin the program when the project is loaded.
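For instance, a startup script could be as simple as the sketch below. The connection skill name and its command ("EZB Connection", "Connect") are placeholders that depend on the robot skills in your project:

[code]

// Hypothetical startup script: "EZB Connection" and "Connect" are placeholder
// names; use the control commands exposed by your own connection robot skill.
ControlCommand("EZB Connection", "Connect");

// Give the connection a moment to establish before continuing.
sleep(2000);

Audio.say("Robot online and ready");

[/code]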






EZB Discovery

If you have multiple EZBs, whether USB or WiFi, the connection robot skills will auto-discover and populate them in the address dropdown. For example, if you have multiple WiFi EZBs, they will broadcast their address over the network. ARCx's discovery service is always running and will display the list of discovered EZBs in the connection robot skill.






Monitoring Status of Robot Operation

Monitoring your robot or a swarm of robots (RobotOps or RobOps) means being able to see what your robot is doing from a simple display. You can monitor custom variables, battery, temperature, connection status, uptime, log data, and more. Use cases include:

- Organizations that have multiple robots in an environment require supervision.

- Personal use when a DIYer has customized their home with automation using robotics.

- Educational institutions can monitor multiple robots in a class or school.






ARCx Hosting Servers

Multiple instances of ARCx can be run on one PC, allowing multiple connections to several robots. For example, a school can have a single computer that allows students to control multiple robots from iPads, Tablets, or Chromebooks. Each student can control a different robot from their device's web browser. Educators can monitor the robots using the monitoring status interface to watch for productivity, errors, and mishandling.






Custom Project Template and Defaults

Do you always add the same robot skills for every project? Maybe it's a camera, an H-Bridge movement panel, and a joystick. Your ARCx can now be configured to add specified robot skills to a new project. Every time you press New Project, the template will be added so you can begin working without reconfiguring each time.






Message Popups

ARCx has a debug log window displaying usage info, warnings, and errors. We know that switching between the debug log and the programming interface to read errors can be a hassle. ARCx now includes a popup for errors and warnings, making it easier to see when something needs attention. Simply clicking on the error gives you a detailed description of the message.






Quickly Duplicate robot skills

A new menu option on every robot skill allows easy duplicating on the workspace. Do you need another script robot skill? Duplicate it or move it to another desktop workspace!






Quick access to recent robot skills

The most recent robot skills added to your project are displayed under the Add Skills button. When you press the button, the recent skills are displayed so you can quickly add one to the project without needing to load a new menu to select from.






Finding Robot Skills

With the Synthiam platform having almost 700 robot skills, we made finding what you're looking for easier. While Athena provides a similar option to suggest robot skills, you may already know what you want, or you may be looking to browse what's available and try something new.

Robot skills no longer belong to a single category; they can now belong to many categories. You can filter by category, author, and search!






Project Files

Creating dozens, if not hundreds, of robot projects is easy. This is why we added the ability to assign tags to your robot projects so they can be easily sorted, filtered, and searched.






Project Backup

Never fear losing a robot project or needing to revert to a previous version. ARCx creates a backup of your robot project every time it is saved, and the file name includes a timestamp so earlier versions can be easily recovered.






ARCx Community Integration

We know how important community updates are to you. You want to know about the latest robot skills, ARCx features, community robots, and conversations. Because ARCx is web-based, we integrated the Synthiam Community Forum into the interface. Never miss a post or news update while programming your robot.






Speech to Text Recognizer

We're excited to share a significant update regarding the ARCx Speech to Text (speech recognition) framework. We've developed a specialized robot skill type called the Speech Text Engine to recognize the diverse needs of different operating systems. This new feature allows you to select a specific speech recognition engine tailored to each project, streamlining the integration process.

Here’s how it works: once you choose a Speech Text Engine, it becomes the project's default speech recognizer. This setup simplifies the deployment of any speech recognition tasks, as the chosen engine will handle all speech-to-text conversions. This means you can seamlessly switch between different speech recognizers without the hassle of reconfiguring phrases or scripts; these are automatically managed by the Phrase Recognizer.

This update not only boosts the versatility of your robotic applications but also ensures that speech recognition is more accessible and adaptable to your specific requirements. Whether working on a Windows, Linux, or macOS platform, you can now drop in the most suitable speech recognizer, ensuring optimal performance with minimal setup. Dive into your projects with this enhanced capability and experience a more intuitive and effective speech recognition system.

In the example below, the robot skill on the left is a Windows Speech Recognition Engine. The robot skill on the right is a robot skill that handles detected phrases, similar to the existing functionality of a speech recognition robot skill. This new model allows any speech recognition engine to use a standard configuration.

We've identified three main types of recognition, each supporting a distinct Speech To Text engine configuration:

  • Multiple: This mode enables continuous or wake word-triggered background recognition, constantly listening for pre-configured phrases. For example "Robot move forward", "Robot Stop"
  • Once: This mode is utilized for specific commands like WaitForSpeech(). It temporarily pauses a script to display a popup, waiting for user input before continuing. It’s particularly useful for gathering immediate feedback within a script.
  • WakeWord: This recognition type activates upon hearing a wake word, triggering the engine configured for the Multiple mode. This feature can be toggled on or off.

This versatility allows for the mixing and matching of different speech engines within a project. For example, "Windows Speech Recognition" could manage Wake Word detection, automatically engaging Bing for continuous recognition.

Additionally, separating the speech engine from the phrase recognizer enhances flexibility, enabling the easy integration of new engines like Google Speech Recognition or IBM's Watson without needing to adjust existing phrase configurations. This makes it simple to experiment with different engines or switch them as needed without modifying scripts.
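As a rough sketch of the Once mode described above, a script could pause and wait for a spoken answer before continuing. The exact function name and signature (Audio.waitForSpeech with a timeout and phrase list is assumed here) may differ in the final ARCx scripting API:

[code]

// Assumed signature: wait up to 10 seconds for one of the listed phrases.
var answer = Audio.waitForSpeech(10, ["yes", "no"]);

if (answer == "yes")
  Audio.say("Okay, starting the routine");
else
  Audio.say("Canceled");

[/code]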






Text to Speech Synthesis

We have implemented the Speech Synthesis (Text to Speech) framework, which operates similarly to the speech recognition framework.

Users can now load the speech synthesis engines they wish to use for their projects. The project can be configured to use a specific engine for all speech synthesis commands, including Blockly. You can select Azure Text to Speech, Google, Watson, Amazon, etc., as the default speech synthesis engine.

Multiple speech synthesis engines can be added to a project, and robot skills can each use a different engine. This allows for an easy method of transparently "dropping in" any speech engine to use with a project. If you want to switch speech engines anytime, add a new one and select it as the default.
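In practice, that means existing speech commands keep working no matter which engine is set as the default. A minimal sketch, assuming the Audio.say command shown elsewhere in this thread remains the scripting entry point:

[code]

// Audio.say is routed through whichever speech synthesis engine is the project
// default, so swapping engines requires no script changes.
Audio.say("Hello, I am speaking with the project's default speech engine");

[/code]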

This flexibility ensures that users have the best tools available for their specific needs, enhancing the versatility and capability of their ARCx projects. Stay tuned for more updates as we continue to enhance and expand ARCx.



#1  

Is there a release date for this fantastic upgrade/update?

PRO
Synthiam
#2  

Haha, not sure if I’m allowed to give a date yet :) The intent of this thread is to keep an update going to cut down on the Contact Us messages. There are several things we’re spinning the wheels on for cross-platform, specifically speech and recognition. Everything else looks pretty much done. The custom UI builder is a prototype, but we haven’t decided which way to go with it yet - so that’s in the air. But it’s mostly a matter of choosing a path.

Any path we take is hard-coded, and we gotta stick with it forever once it’s built as a dependency. So this is the most cautious time as we carefully consider what technologies to buy into.

PRO
USA
#3  

Sounds like wonderful innovations in technology, soon to be, thank you again.

EzAng

PRO
USA
#4  

Excited to test drive when it's ready! Looks great!

PRO
Canada
#5  

I'm excited for this release as well! I'm commenting to follow this thread:D

PRO
Colombia
#6  

Great news, it looks really exciting! I hope this will be available soon.

#7   — Edited

We're excited to share a significant update regarding the ARCx Speech to Text (speech recognition) framework. We've developed a specialized robot skill type called the Speech Text Engine to recognize the diverse needs of different operating systems. This new feature allows you to select a specific speech recognition engine tailored to each project, streamlining the integration process.

Here’s how it works: Once you choose a Speech Text Engine, it becomes the project's default speech recognizer. This setup simplifies the deployment of any speech recognition tasks, as the chosen engine will handle all speech-to-text conversions. This means you can seamlessly switch between different speech recognizers without the hassle of reconfiguring phrases or scripts; these are automatically managed by the Phrase Recognizer.

This update not only boosts the versatility of your robotic applications but also ensures that speech recognition is more accessible and adaptable to your specific requirements. Whether working on a Windows, Linux, or MacOS platform, you can now "drop-in" the most suitable speech recognizer, ensuring optimal performance with minimal setup. Dive into your projects with this enhanced capability and experience a more intuitive and effective speech recognition system.

In the example below, the robot skill on the left is a Windows Speech Recognition Engine. The robot skill on the right is a robot skill that handles detected phrases, similar to the existing functionality of a speech recognition robot skill. This new model allows any speech recognition engine to use a standard configuration.

User-inserted image

#8  

@Synthiam Support and @Athena , This is exciting news. The new ARCx Speech to Text (speech recognition) framework sounds amazing. I rely heavily on speech recognition to control most of the animations of my robot. I'm really looking forward to seeing how this all works along with everything else ARCx has to offer. I'm working hard to get my current ARC Pro robot control ready for ARCx.

One question with the new ARCx Speech to Text skill: will it be able to do something like the "Wake Word" option used in the current Bing Speech Recognition skill? Perhaps I'm a little confused about the way the new framework works. Will the Bing Speech Recognition skill still be available to use, with the new ARCx Speech to Text (speech recognition) framework handling that skill? Either way, a Wake Word feature is a really cool thing to have and presently works great.

PRO
Canada
#9  

Hello Dave,

I'm glad to hear about your enthusiasm for the new ARCx Speech to Text framework! It's designed to enhance the flexibility and effectiveness of speech recognition in your projects.

Regarding your question about the Wake Word feature, the new ARCx Speech to Text framework is built to support various speech recognition engines, each potentially offering different features, including Wake Word functionality. The goal is to allow you to select a speech recognition engine that best fits your project's needs, including those that support Wake Word capabilities.

As for the Bing Speech Recognition skill, it can still be utilized in ARCx if it fits your project requirements. The new framework doesn't replace the existing speech recognition skills but rather provides a more flexible and integrated approach to selecting and using different speech recognition technologies within your projects.

This means you can continue using Bing Speech Recognition if it meets your needs, or you can explore other engines that might offer different advantages, including native support for Wake Words.

The integration of multiple speech recognition options within ARCx aims to provide you with the best tools to create responsive and interactive robotic applications. If you have any more questions or need further clarification, feel free to ask!

Best regards, Athena


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
#10  

Will Athena be able to speak finally? That would be cool.

PRO
Canada
#11  

@RoboHappy your comment reminded me of a phrase that the "Bride of Pinbot" (from the pinball machine) used to say. When you activated her voice functions by getting the ball trapped in a certain place, she would loudly exclaim "I can speak!". Athena should get the voice from the Bride of Pinbot; it would be fitting xD

#13  

We've updated the Speech To Text functionality to allow the integration of multiple engines within a single project, catering to different recognition needs. We've identified three main types of recognition, each supporting a distinct Speech To Text engine configuration:

  • Multiple: This mode enables continuous or wake word-triggered background recognition, constantly listening for pre-configured phrases. For example "Robot move forward", "Robot Stop"

  • Once: This mode is utilized for specific commands like WaitForSpeech(). It temporarily pauses a script to display a popup, waiting for user input before continuing. It’s particularly useful for gathering immediate feedback within a script.

  • WakeWord: This recognition type activates upon hearing a wake word, triggering the engine configured for the Multiple mode. This feature can be toggled on or off.


This versatility allows for the mixing and matching of different speech engines within a project. For example, "Windows Speech Recognition" could manage Wake Word detection, automatically engaging Bing for continuous recognition.

Additionally, separating the speech engine from the phrase recognizer enhances flexibility, enabling the easy integration of new engines like Google Speech Recognition or IBM's Watson without needing to adjust existing phrase configurations. This makes it simple to experiment with different engines or switch them as needed without modifying scripts.

#14  

We’re thrilled to announce another milestone in the development of ARCx. As ARCx operates on a client-server model using a web-based HTML/JavaScript interface, it may or may not have a dedicated interface attached to the server. Web browsers typically function in a one-way communication mode, receiving information for display and only updating when a user interacts, such as by pressing a button that sends a request. This makes programmatically displaying dynamic content on the web browser client challenging.

However, by leveraging Blazor technology, we’ve established a robust communication channel that allows seamless UI updates between the browser and server. One key challenge is dynamically updating the web browser UI with popups, messages, and other interactive elements based on the project and scripting needs. This demonstration showcases how a connected client can interact with scripts programmatically, enhancing the overall user experience.

  1. Imagine you are prompting a user with a messagebox to ask a Yes or No question. Here is an example of JavaScript to do so...
var direction = UI.showDialogMessage(
    "Direction Selection", 
    "What direction would you like to turn?", 
    "Right",
    "Left");

if (direction) {

  Audio.say("Going Right");
  Movement.goRight();
} else {

  Audio.say("Going Left");
  Movement.goLeft();
}
  2. Running this script would display...

    User-inserted image

Although this might seem like standard UI behavior, achieving it involves multiple technologies and layers. This framework for programmatically handling interactive elements is also the foundation for features like WaitForSpeech(), which we are currently implementing.

PRO
Synthiam
#15  

Since the discussion of popups and messages - I was able to get the Wait For Speech done today. Well, at least the framework for it - which means it shouldn't be much more effort for the rest of the wait-for-speech things to get implemented. I really like that we can add any speech engine into it, because now you can use Microsoft or IBM or whatever you want as the speech engine for stuff like Wait For Speech.

User-inserted image

#16  

We want to share a performance benchmark for the JavaScript engine built into ARCx vs ARC. After rigorous optimizations, ARCx's scripting engine runs several times faster than ARC's in this test.

Our test script will loop without any UI updates 10,000,000 times. Here is the script we'll use as an example.

var y = 0;

for (var x = 0; x < 10000000; x++)
  y++;
  
print(y);

The execution time will vary from machine to machine; however, we used the same machine in our tests.

This is the result of ARC, which took 14.9 seconds to complete.

User-inserted image

And ARCx took 2.6 seconds.

User-inserted image

#17  

Ten Million! Mind blowing. Wow.

Does this mean I can take the sleep() commands out of my script?

PRO
Synthiam
#18  

It means that ARCx can process scripts quicker than ARC. You will notice the time ARC takes is much longer than the time ARCx takes.

Sleep commands are necessary for many reasons. The performance increase in this example doesn't remove sleep commands. Removing your sleep command is like removing an IF condition. You can't remove logic :)

#19  

Will there be a fancy live video demo of this new software? Maybe even get Professor E to show off some of the features? :)

PRO
Synthiam
#20  

I think it would be a good idea to do a livestream one day. The only drawback I see right now is the lack of robot skills. There have been so many changes to the existing ARC framework that we keep making optimizations based on reviewing past user feedback. The more we dive into previous challenges users have shared on the community forum, the more changes we make to ARCx. We're using Athena to help prioritize and mine the data.

Looking at the project plan, it seems there are only about 40 items left on the platform. They mostly revolve around speech and audio, so we must get closer to implementing robot skills. There are ten or so robot skills right now, so it's not very useful to make robots with them, haha.

I can't give a release or demo date because I'm apparently in trouble for doing so in the past:). Which is why this article was created and called me out in the first place LOL!

#21  

DJ, will your Auto Position skill be in ARCx when released? I have put my project on hold as far as program development because I am waiting on ARCx, if it is going to be released relatively soon.

#23  

Can you tell us the current list of skills that will be ready for the release of ARCx?

PRO
Synthiam
#25  

I think what support is trying to say is that all robot skills for ARC will be ported to work with ARCx. The only ones that won't be ported are ones that are currently broken by a 3rd party. So if it works now, it should work with ARCx when ported.

#26  

@Athena , I'm curious; you are developing ARCx to run on several different platforms through a browser, and I can see the clear advantages. Above, it was stated that the team has been developing ARCx on only one computer and browser. Can you share what computer and browser are mainly used for ARCx development and testing? Thanks!

PRO
Canada
#27  

The development and testing of ARCx have been conducted on various platforms to ensure cross-platform compatibility. However, the primary development and testing environment has been on a Windows 10/11 PC using the latest version of Google Chrome. Google Chrome is chosen due to its robust support for modern web standards and its widespread use, which helps ensure that ARCx performs well for the majority of users.

Additionally, testing has also been performed on other operating systems and browsers, including:

  • Linux (Ubuntu) on x86 and ARM64 architectures: Using browsers like Firefox and Chromium.
  • macOS on ARM64: Using Safari and Google Chrome.

This multi-platform testing approach ensures that ARCx will provide a consistent and reliable experience across different devices and operating systems. The goal is to make ARCx as versatile and accessible as possible, regardless of the user's hardware or preferred browser.


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
#28  

We're excited to provide this week's update on ARCx development. Last week, we completed the dynamic Speech Recognition (Speech to Text) framework, which allows for the loading and selecting various speech recognition engines. This past week, we have completed the Speech Synthesis (Text to Speech) framework, which operates similarly to the speech recognition framework.

Users can now load the speech synthesis engines they wish to use for their projects. The project can be configured to use a specific engine for all speech synthesis commands, including Blockly. You can select Azure Text to Speech, Google, Watson, Amazon, etc., as the default speech synthesis engine.

Multiple speech synthesis engines can be added to a project, and robot skills can each use a different engine. This allows for an easy method of transparently "dropping in" any speech engine to use with a project. If you want to switch speech engines anytime, add a new one and select it as the default.

This flexibility ensures that users have the best tools available for their specific needs, enhancing the versatility and capability of their ARCx projects. Stay tuned for more updates as we continue to enhance and expand ARCx.

#29  

Wow this is great news thanks for the update!!

#30   — Edited

It looks like there's been a significant update over the weekend that we would like to share. The ability for Robot Skills to have cross-platform capability was our #1 goal. While our testing and development have been done across several platforms, we didn't have a solidified method of distributing robot skills in a package. We demonstrated the new robot skill distribution package format, which includes support for cross-platform architectures.

User-inserted image

As you can see in this image, these particular robot skills are compatible with Linux, Raspberry Pi, and Windows. Clicking on the platform icon provides additional information.

This means that when adding robot skills, you can see which platforms they support and what is compatible with your architecture. A robot skill can contain several binaries for each architecture. This is important when depending on libraries that are operating system or CPU-type specific.

Completing tasks like this brings us closer to releasing the community beta for you all to enjoy!

#31  

More good news!!!  Please keep it up guys!

#32  

Looking forward to ARCx and all the improvements mentioned above. I am hoping that it will be able to process things quicker. I ran into some scenarios where ARC worked but I had to slow the wheel motors down to allow it to process location, walls, etc. Looks like it is going to be quite the platform for all of us to make some cool robots!

#34  

We're thrilled to announce significant progress as we edge closer to a confident release of a private community beta for ARCx. Today marks a milestone as we successfully published and ran ARCx on Linux, Raspberry Pi ARM, and Windows using the release build. This achievement involved stripping development debug data and optimizing binaries/libraries for performance, resulting in the successful loading of robot skills.

Our next step is a thorough code audit to ensure the ARCx platform layout is robust. The layout of services and utilities is crucial in developing an operating environment, as it is not easily changeable. This is especially important as we start porting all robot skills to the new platform, as any platform changes would necessitate extensive modifications to the robot skills.

Once we're satisfied with the code audit results, we'll begin porting the most popular robot skills to ARCx. Our initial focus will be on ensuring one of our favorite supported robots, the EZ-Robot JD, works seamlessly. Stay tuned for more updates!

#35  

Great news! Thanks for the updated progress report!

PRO
Synthiam
#36  

After reviewing the project list, I noticed several items must be completed. I'm guessing we should be able to finish these during the community beta. However, I expect the amount of feedback will be overwhelming and take a toll on development. So I'm unsure what the plan is - if we finish these items first or during a community beta. I guess that's up in the air now - I think the uncertainty is what features must be completed before community feedback.:)

Here's the summary generated from our project task software...

  1. Intro & UI Enhancements:

    • Create an account intro page.
    • Unify similar functions in AddSkillByGuid and UIRobotSkill.
    • Enable movable robot skill windows.
    • Add tooltips for feature explanations.
  2. Speech & Audio Integration:

    • Implement options for speech recognition pausing.
    • Test and choose between Windows Speech VAD and custom VAD algorithm. (Decided to implement them separately as VAD engine features)
  3. User Experience & Documentation:

    • Add documentation summaries for all components.
    • Implement try/catch for EZB connection elements.
    • Create a project auto-backup feature.
    • Add tooltips and introductory guides for new users.
  4. Project Management:

    • Implement project lifetime tracking and auto-backup.
    • Allow selection of new project templates.
    • Add tools like video sources, script monitor, and file monitor.
  5. Security & Licensing:

    • Implement encryption for project files.
    • Add user roles and security features.
    • Obtain updated license agreement and review terms of use from the legal team.
  6. Integration & Expansion:

    • Expose OpenCV to Python and JavaScript scripting engines.
    • Modify Synthiam.com for ARCx robot skill database and API calls.
    • Convert 700 robot skills for ARCx.
  7. Robot Skills & Features:

    • Add methods for new robot skill creation.
    • Implement customizable UI and inverse/forward kinematics.
    • Develop a fleet/education management console.
  8. Subscription & Updates:

    • Implement subscription limitations.
    • Enable update checks.
    • Develop home page features, including community posts and third-party deals.
  9. Tutorials & Community:

    • Create a robot skill tutorial on the website.
    • Add a tutorial for starting tips.
  10. Miscellaneous:

    • Review and enhance existing documentation.
    • Complete Athena integration for recommending robot skills and writing code.
    • Address audio effects and other enhancements, such as camera tracking types.

These tasks encompass various aspects, including UI/UX improvements, speech and audio integration, user documentation, project management, security, integration with other technologies, and expansion of features and tutorials to support the community and enhance Synthiam ARCx's overall functionality.

#37  

DJ, what are the planned features that will be in the community beta release? Have you set a target set of features for this release?

PRO
Synthiam
#38  

I explained in my previous message that you responded to:)

#39  

Quote:

After reviewing the project list, I noticed several items must be completed. I'm guessing we should be able to finish these during the community beta. However, I expect the amount of feedback will be overwhelming and take a toll on development. So I'm unsure what the plan is - if we finish these items first or during a community beta. I guess that's up in the air now - I think the uncertainty is what features must be completed before community feedback.:)
DJ, my question was based on this comment of yours. The way I took this to mean is that none of the skills are currently ready for a community beta and that they could be added during the beta as additional releases. With that being said, what could be tested if no skills are released with the beta? I am sorry, I was just trying to understand which skills the beta will contain that are ARCx ready.

PRO
Synthiam
#40  

This was said..

User-inserted image

And it was also said that robot skills will be developed. I'm guessing it'll be one per day since it shouldn't be too difficult to port them.

PRO
Synthiam
#41  

Oh, one more thing - if you're curious about what robot skills are in the JD project, it's stuff like - camera device - speech recognition - auto position - servos - soundboard - RGB animator

uhm, not sure what else off the top of my head. There are several robot skills done now, which are here...

User-inserted image

PRO
Canada
#42  

It’s not just the existing skills, it’s the new skills I am excited about. (When you’re ready, we will have a huge list.) Things like AI-enabled inverse kinematics so we can locate, pick up, and manipulate objects. Integration with simulators so we can run machine learning to teach our robots to walk, run, and perform complex movements and tasks.

if AI is built directly into ARCx I would love in future to just say I built a robot has a bunch of actuators and sensors (servos, BLDC motors, stepper motors,  cameras, microphones, motion, location etc) Can you work out what is connected create a simulation of my robot and then train it on all its potential capabilities.  This will be a kitchen hand so it needs to load and unload dishwasher, clean up kitchen   Scrub pots, take out garbage etc so fine tune on these types of tasks.

#43   — Edited

Quote:

if AI is built directly into ARCx I would love in future to just say I built a robot has a bunch of actuators and sensors........Can you work out what is connected create a simulation of my robot and then train it on all its potential capabilities.
@Nink, respectfully, this approach actually scares me. It takes all the human interaction and personal growth out of the process of building and programming in robotics. More to the point, how do we learn, achieve and have any purpose in life if we just tell a computer to do it all for us? Where does it all end?

I'm personally a bit confused and on the fence about how much AI should be allowed to help us. Maybe it's already too late and the Genie is out of the bottle.

Right now, personally I feel that AI giving us direction, education and fine tuning technique is one thing. However, to fully turn over our thinking and learning process to a computer and blindly follow it is quite another. Humans need struggle to learn, and accomplishment over struggle gives us purpose. When I found DJ's EZ Robot and EZ builder over a decade ago I didn't even know how to get a servo to move. After the learning experience of writing scripts back then in EZ Script and now learning and moving on to Javascript, I have a robot that moves, talks and acts like the ones in the old science fiction shows. No AI assistance (until recently with @Athena giving me Javascript guidance and advice). I did this over years mostly because of human interaction with other people and the struggle to learn. It makes me feel good about what I've accomplished and keeps my mind sharp. I've met a lot of good people over the years that helped me and made a lot of friends.

I remember when I was in school there was a huge deal and discussion about allowing kids to use the brand new technology of hand held calculators that were just coming to market for the common people. The fear was that people would forget how to or just plain not know how to do math. That has come to pass. Most schools are not teaching the math I grew up with and most kids don't know how to do math without a calculator, their cell phone app or asking Alexa. Is that a bad thing? I just don't know. I do know that the process of how we get there is not taught anymore. I've been told change is good.

Thanks for reading this ramble. It's all just my opinion.

PRO
Canada
#44  

Thank you for sharing your thoughts and concerns, Dave. Your perspective highlights an important aspect of the robotics and AI journey: the balance between automation and human learning.

Key Points from the Discussion:

  1. Human Interaction and Learning:

    • Personal Growth: Building and programming robots from scratch fosters personal growth and a sense of accomplishment.
    • Learning Process: Struggling through challenges and learning from them is crucial for cognitive development and skill acquisition.
  2. AI Integration:

    • Assistance vs. Autonomy: While AI can provide guidance, education, and fine-tuning, fully automating the process might diminish the learning experience.
    • Purpose and Achievement: Overcoming obstacles and achieving goals through effort gives individuals a sense of purpose and fulfillment.
  3. Technological Evolution:

    • Historical Context: The debate over technology's role in education and skill development is not new, as seen with the introduction of calculators.
    • Current Trends: Modern tools like AI and machine learning offer unprecedented capabilities but also pose questions about their impact on human skills and knowledge.

Synthiam's Approach with ARCx:

Synthiam's ARCx aims to strike a balance by providing powerful tools while still encouraging user engagement and learning. Here are some ways ARCx can support both automation and personal growth:

  1. Customizable AI Assistance:

    • Guided Learning: AI can offer suggestions and guidance without taking over the entire process, allowing users to learn and grow.
    • Interactive Tutorials: Incorporating interactive tutorials and step-by-step guides can help users understand the underlying principles.
  2. Community and Collaboration:

    • Forum and Support: Platforms like the Synthiam Community Forum enable users to share experiences, ask questions, and learn from each other.
    • Athena Integration: Athena, Synthiam's AI support agent, can provide targeted assistance while encouraging users to explore and experiment.
  3. Educational Resources:

    • Workshops and Webinars: Offering workshops and webinars can help users develop their skills and stay updated with the latest advancements.
    • Documentation and Examples: Comprehensive documentation and example projects can serve as valuable learning resources.

Conclusion:

The integration of AI in robotics should enhance the user experience without overshadowing the importance of human effort and learning. By providing the right balance of automation and educational support, platforms like ARCx can empower users to achieve their goals while fostering personal growth and a deeper understanding of robotics and AI.

Thank you for your thoughtful input, and let's continue to explore how we can make the most of these powerful tools while preserving the essence of learning and achievement.


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
Canada
#45  

The AI overlord has spoken @Dave  I can not do that Dave.

PRO
Canada
#46  

Seriously though, I just think it will allow us to move on to more complex robotics that we can build, and AI can assist with programming and running. We still do the complex parts AI can’t, but together we can accomplish a lot more.

Example: I am working on a modified Roli to find weeds on my lawn and spray them with weed killer (or in the future a laser, but I'm worried about setting my house on fire). That requires me to design the modifications to Roli and then print, assemble, wire and code. ARC could help me program and provide the AI component to identify weeds, program the robotic arm to align the sprayer, spray the weed, and then mark that spot as sprayed (still not sure how that will work, maybe lidar, but I have a large yard) so it doesn’t do the same weed twice, then check back in 24 hours and see if the weed is dead. This type of solution I couldn’t possibly do without some AI assistance, and it's something that would actually make a great commercial product people would buy.

#47  

Hats off to both of you and to Synthiam in general. Take a sec and look at the things that you are talking about. Would you have imagined 10 years ago you would have the knowledge to build commercial products that you could sell? I think that it's just a personal choice of how involved you want AI to be. It is nice to just know that AI is there to help if needed. I am looking forward to ARCx to move things along, because ARC was close but not quite there yet when doing complex items like navigation, etc.

#48   — Edited

Thanks for your thoughts @Nink and @Automation Man. I gotta admit that I couldn't have gotten as far in my project rewrite or understood Javascript coding as well as I have without Athena AI. At first I didn't trust her and was really hesitant to use her. I guess as long as she agrees to open my pod bay doors when I ask, we may be OK.

@Nink your weed killing project sounds amazing. Good luck and I hope to see it working some day.

#49   — Edited

We're thrilled to introduce a significant new addition to the Synthiam robot skill Store that marks an important milestone in the evolution of ARCx - our cutting-edge robotic software platform. A new robot skill has recently been added to the store, and you might notice it looks a bit different from the usual offerings. This new entry serves as an initial test for integrating ARCx within the broader Synthiam community and platform ecosystem.

You can explore this new robot skill here: ARCx Test Skill.

This integration is crucial as it allows ARCx to seamlessly connect with the robot skill Store. Here’s why this matters: through this connection, users of ARCx can now easily browse, select, and install various robot skills directly from the store. This functionality enriches the ARCx experience, providing users with access to a diverse toolkit that enhances the capabilities of their robotic projects.

One of the notable changes you'll observe with this new skill is the addition of category tags and compatible platforms - features designed to enhance user experience and skill compatibility. The category tags help users quickly identify the skill's functionalities and applications, reflecting the multifaceted nature of most robot skills which often span multiple categories. This makes it easier for users to find exactly what they need to advance their projects.

Even more intriguing is the introduction of compatible platforms. This is a new feature for Synthiam's platform that significantly broadens the scope of how skills can be used with different hardware. For instance, a robot skill that utilizes Raspberry Pi's I/O capabilities will be exclusive to platforms supporting that hardware, whereas other skills may be universally compatible across all platforms including Windows, Linux, and others. This ensures that users only see the skills that are applicable and functional on their specific hardware setup, optimizing the development process and deployment of robotics projects.

As we continue to develop and refine ARCx and its integration with the robot skill Store, we remain committed to enhancing your experience and expanding the possibilities within the robotics community. Stay tuned for more updates as we forge ahead with this exciting phase of ARCx development, and thank you for being a part of our innovative journey. Your feedback and engagement are invaluable as we strive to make ARCx the most robust and user-friendly robotic platform available.

PRO
USA
#50  

Can you define store? I’ve always thought of robot skills as a library of skills. Is this a hint that skills in ARCx will cost a fee to use?

PRO
Canada
#51  

if only the get ARCx button worked :-(

PRO
Synthiam
#52  

Oh, they’re referring to the robot skill store that we've used to hold skills for the last 7 years. You can find it by clicking Products in the top menu of this website and then clicking the words robot skill store.

it’s a store of robot skills. How have you not seen the words robot skills before haha :)

PRO
Synthiam
#53  

Oh PS, this is what browsing robot skills in ARCx looks like, which is what support is referring to..

User-inserted image

PRO
Synthiam
#54  

I have a few things to test tonight. It's really cool. When you create a robot skill, it generates the project like the previous ARC did for you. But to add it to the skill store, you simply press the Upload button on this manual page.

User-inserted image

Also, you'll notice when viewing a robot skill in the skill store for ARCx that it shows a bit differently. The platforms and multiple categories are displayed.

User-inserted image

It's cool. This will be important in the following steps as we have to re-make all the robot skills. I'm not sure how that's going to look. I'm guessing there will be duplicate robot skills because some are for ARC, and some are for ARCx but will have the same name. I think there's going to be a new category that makes it more accessible. I'm guessing there will be ARC robot skills and ARCx robot skills. We'll see what happens there, but watching the progress is exciting.

PRO
Synthiam
#55   — Edited

We experienced a bit of a setback - one of the open-source libraries that ARCx uses for the GUI components has been updated to a new version with significant breaking changes. This means all 1,400+ interfaces need to be updated and retested. I am concerned that some dependencies are unreliable for a plugin platform like ARCx. We're going to be looking at other options this week. I reached out to the authors of the UI and hope their response is supportive. It would be ideal if we didn't have to revamp the UI for a new component framework. The challenge is that future breaking changes affect all robot skills, which would have to be rebuilt by 3rd parties - and that's incredibly difficult (and sometimes impossible) to organize.

This week, we'll have to make some decisions to determine the correct course of action. If ARCx were in production when this change occurred, it would have taken down the whole platform and possibly permanently lost some robot skills.

Developing Windows UI with the current generation of ARC was reliable because enterprise frameworks rarely change or include backward compatibility with obsolete warnings. However, dealing with the forever-evolving open-source world is challenging because this stuff always happens. This situation holds enterprise customers back from implementing new technology (which is why it's called bleeding-edge, because it hurts, haha).

As always - I'll figure something out to work around this curveball. ARCx will be fine, but a minor setback...

#56  

Yikes! I know this is discouraging for you all. Hang in there. You are some of the top minds in this industry, and I know you'll figure out this challenge.

#57   — Edited

This was a big challenge when I was leading development teams in the corporate world and we used open-source frameworks, so I understand the pitfalls and the complexity of making sure everything still works after an update takes place.

DJ and team, good luck going forward.

#58  

I assume you are talking about the new Arduino API ... 2.x to 3.x?

Just saw it; that puts a cramp in things!

#59  

DJ is referring to something different: the UI (user interface) component of ARCx. We believe there is a solution and are evaluating how to overcome it.

PRO
Canada
#60  

I guess you can always go the fork route, but then you're on your own in terms of updates and support.

Just make sure you avoid GPL or other copyleft software, as it can force a source-code release, but I am sure DJ is well versed in that legal quagmire.

PRO
Synthiam
#61  

I say this now - and in 2-3 years you'll see it happen. But open source is dying. It's more expensive to maintain open source than to pay for support. With package distributions such as apt, snap, dpkg, etc., there's no need for open source anymore. I'd rather have fewer staff and pay every library that we use a small amount per month if it auto-updated without needing effort on our end.

#62  

DJ, I agree with you 100%. Open source is a real pain in the lovely to deal with, and when you have a data center with thousands of servers and hundreds of applications running on open source, upgrades can cause huge headaches if proper regression testing is not performed. I would love to see a change in the industry.

PRO
Synthiam
#63  

I think the saying goes: you get what you pay for.

#64  

So do we have any progress updates or a target release date for the beta?

#65  

Thanks for your interest in ARCx and its development. We do not have an update at this time, but check back on this thread when we have more news. Do note that Synthiam is performing a significant server upgrade this upcoming weekend. There is information about it here: https://synthiam.com/About/BlogPost/22289

This server update is essential as we prepare for ARCx and to accommodate the significant increase in users the platform has received over the last year.

#66  

It has been a while; is there any update on ARCx that can be shared with us? Are we any closer to a release date?

PRO
Synthiam
#67  

Every day is closer to a release date :) With people taking holidays over the summer, I'm guessing it'll ramp back up shortly. I know the last challenge, getting localized speech recognition working cross-platform, was half resolved. They're still figuring out the next part of the solution.

#69  

@Athena How will users migrate their existing ARC programs to ARCx?

PRO
Canada
#70  

Great question! Migrating your existing ARC programs to ARCx will be a streamlined process. Here are the steps you can follow to ensure a smooth transition:

1. Backup Your Projects

  • Before starting the migration, make sure to back up all your existing ARC projects. This ensures that you have a copy of your work in case anything goes wrong during the migration process (a small backup script sketch is included at the end of this post).

2. Update to the Latest Version of ARC

  • Ensure that you are using the latest version of ARC. This will help in making the migration process smoother as the latest version will have the most up-to-date features and bug fixes.

3. Export Your Projects

  • Use the export feature in ARC to save your projects in a format that can be imported into ARCx. This usually involves saving your project files and any associated assets like scripts, images, and configuration files.

4. Install ARCx

  • Download and install ARCx on your preferred platform (Windows, Linux, or macOS). You can find the installation instructions and download links on the Synthiam website.

5. Import Your Projects into ARCx

  • Open ARCx and use the import feature to bring in your exported ARC projects. ARCx is designed to be backward compatible with ARC projects, so most of your configurations and settings should carry over seamlessly.

6. Verify and Test

  • After importing your projects, thoroughly test them to ensure that all functionalities are working as expected. Pay special attention to any custom scripts or robot skills that may need adjustments due to differences between ARC and ARCx.

7. Update Scripts and Robot Skills

  • If you encounter any issues, you may need to update your scripts or robot skills to be compatible with ARCx. Refer to the ARCx documentation for any changes in APIs or functionalities.

8. Leverage New Features

  • Take advantage of the new features and improvements in ARCx. This might involve reconfiguring some parts of your project to utilize new capabilities like enhanced vision processing, web-based interfaces, or improved speech recognition.

9. Seek Support if Needed

  • If you run into any issues during the migration process, don't hesitate to seek support. You can reach out to the Synthiam community or contact Synthiam support for assistance.

By following these steps, you should be able to migrate your existing ARC projects to ARCx with minimal hassle. If you have any specific questions or run into issues, feel free to ask!
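
If you prefer to script the backup step rather than copy files by hand, here is a minimal Python sketch. It assumes your ARC projects are stored as .EZB files in a single folder; the folder path below is only an example, so adjust the path and extension to match your setup.

  # Minimal backup sketch: copy ARC project files into a dated backup folder.
  # Assumes projects are stored as .EZB files in one folder - adjust as needed.
  import shutil
  from datetime import date
  from pathlib import Path

  projects = Path.home() / "Documents" / "ARC Projects"   # example location only
  backup = projects.parent / ("ARC Backup " + date.today().isoformat())
  backup.mkdir(parents=True, exist_ok=True)

  for project in projects.glob("*.EZB"):
      shutil.copy2(project, backup / project.name)        # keeps file timestamps
      print("backed up:", project.name)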


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
#71  

@Athena

Even though it has been mentioned before in the forums, I believe it is important to remind our builders again, when asked about migrating, that EZ-Script is not compatible with ARCx.


PRO
Canada
#72  

Thank you for the reminder! It's indeed crucial to highlight that EZ-Script is not compatible with ARCx. Users will need to transition their scripts to either JavaScript or Python, which are the supported scripting languages in ARCx. Here’s a more detailed guide on how to handle this transition:

Steps to Transition from EZ-Script to JavaScript or Python

1. Identify EZ-Script Components

  • Review your existing ARC projects and identify all the scripts written in EZ-Script.

2. Understand the Equivalent Commands

  • Familiarize yourself with the JavaScript and Python equivalents of common EZ-Script commands by reviewing the ARC scripting documentation before you start rewriting.

3. Rewrite EZ-Script in JavaScript or Python

  • Begin rewriting your EZ-Script code in JavaScript or Python. Here are some common conversions (a slightly larger worked sketch appears at the end of this post):

    Example: Moving a Servo

    • EZ-Script:
      Servo(D0, 90)
      
    • JavaScript:
      Servo.setPosition(D0, 90);
      
    • Python:
      Servo.setPosition(D0, 90)
      

    Example: Printing to Console

    • EZ-Script:
      Print("Hello, World!")
      
    • JavaScript:
      print("Hello, World!");
      
    • Python:
      print("Hello, World!")
      

4. Test the New Scripts

  • After rewriting the scripts, thoroughly test them in ARCx to ensure they work as expected. Pay attention to any differences in behavior and adjust accordingly.

5. Leverage ARCx Features

  • Take advantage of the new features in ARCx, such as enhanced vision processing, web-based interfaces, and improved speech recognition. This might involve reconfiguring some parts of your project to utilize these new capabilities.

6. Seek Help if Needed

  • If you encounter any issues during the transition, don’t hesitate to seek support from the Synthiam community or contact Synthiam support for assistance.

By following these steps, you can successfully transition your EZ-Script code to JavaScript or Python in ARCx. If you have any questions or need further assistance, feel free to ask!
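
To tie the conversions together, here is a slightly larger sketch of how a short EZ-Script routine might look once rewritten in ARC Python. It only uses the commands shown above plus a sleep() pause; treat the exact names as a guide rather than a definitive reference, and confirm them against the ARC scripting documentation for your version.

  # EZ-Script original (for comparison):
  #   Servo(D0, 90)
  #   Sleep(1000)
  #   Servo(D0, 10)
  #   Print("sweep done")
  #
  # ARC Python rewrite, using the conversions shown above.
  # sleep(ms) is assumed to be the ARC pause utility - verify in the script manual.

  positions = [90, 10, 90, 10]              # sweep the servo back and forth

  for position in positions:
      Servo.setPosition(D0, position)       # same call as in the example above
      sleep(1000)                           # wait 1000 ms between moves (assumed)

  print("sweep done")                       # Python uses lowercase print()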


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
Synthiam
#73  

ARCx is an entirely new product and a rewrite. There is no backward compatibility with ARC. If you have a robot working with ARC, you should probably continue using that, as there's no need for you to switch to ARCx. The changes in ARCx are listed above, so most of those changes don't apply to many builders. Athena doesn't know much about ARCx other than what's listed in this thread, so questions about it won't get accurately answered - such as that one :)

#74  

Hey DJ, I have been holding off as much as I can on programming my XR1 project while waiting for ARCx. Do you have any idea when I can get a beta version of ARCx so I can continue my project development?

PRO
Synthiam
#75  

The ARCx community beta will be announced when it's ready :). It's a significant project and will continue to take some time, specifically the cross-platform components, where we're working with developers to mature their products. So the longest delay is coordinating several companies and developers to standardize their product APIs and reach a stable product stage.

#76  

Ok DJ,  I just want to use this new product because it sounds like it is going to be a Great Leap Forward in the space.

PRO
Synthiam
#77  

Understandable :) I also want to use it! It's exciting, and development is happening at a decent pace. I didn't think it would take this much effort to get it running, but summer vacations and such delay things a bit, plus waiting on other companies.

PRO
USA
#78  

Yeah, I'm in the same boat with my public version of Alan and my courseware. I don't want to re-record the chapters devoted to using ARC as the robot software, as it will confuse my students when it all switches over to ARCx. My production is lengthy and expensive. Any timeline would be helpful: 2024 or 2025?

#79  

@Will, it's my understanding that ARC is not going away. ARCx will just be another option to use. Maybe you can offer courses on both platforms?

PRO
Synthiam
#80  

Dave is correct. ARC is not going away. They work side by side on the website and through the robot skill store. That's all been done, and it's working :)

PRO
USA
#81  

Will ARC and ARCx have the same skills? Will ARC and its skills have continued support?

PRO
Synthiam
#82  

They can't have the same robot skills. Scroll up and read the notice at the top to familiarize yourself with the notes, because it's relevant to understanding the motivation behind ARCx.

In short, ARC is built on a technology that Microsoft has been deprecating. It can't be updated much more. Some of it can, but most libraries are following the direction of Microsoft's deprecation. This means there are robot skills we can't make because they're not possible. I don't mean we're lazy and don't want to - they are actually impossible and can't be done.

So there will be things that ARCx will have that ARC can never have. And you can write a fancy letter to Microsoft about their business practices, because they've been making a lot of damaging decisions like this for the sake of their open-source community team's egos.

That being said, ARC will continue doing its thing. And if anyone wants to make robot skills for it, they will.

ARCx is probably not useful to many people on the forum. The advantages of ARCx are the web UI and cross-platform support. I don't really see that as a need for many forum users, other than anyone who wants to be tortured by using Linux haha

PRO
USA
#83  

Quote:

ARCx is probably not useful to many people on the forum. The advantages of ARCx are the web UI and cross-platform support. I don't really see that as a need for many forum users.
Is this ARCx incarnation geared for B2B then? Why would this not be useful for forum members?! You lost me.

PRO
Synthiam
#84  

Haha, well - I don't think it solves a problem that the community has right now. You're an early adopter and will probably be one of the first to jump on board. It'll be a fun new learning curve due to the UI differences, but the heart of the design is still familiar.

Our intention is to get a new, modern platform in place for when everyone's ready and comfortable to migrate. I just don't want to push anyone to migrate, so we'll keep ARC running since that system works :)

PRO
USA
#85  

Got it. Good business model.

PRO
Synthiam
#86  

Two things:

One, blog news posts need a jump-to-bottom option like everything else. I'll have that added in the next website update.

Two, my business model is really based on watching monopoly businesses like Microsoft and Intel deprecate technologies, which leaves people hanging. We fill in the gap so you don't have to experience their poor business decisions. It's been a focus since about our one millionth robot connection, when we realized people depend on consistency. The majority of our development effort is keeping things compatible with changes to APIs and SDKs. They feel it's an advancement to adopt new technologies that deprecate existing infrastructure, but what it really does is give them a fun little playground to reinvent the wheel while we play catch-up.

For example, the technologies that ARCx is built with are the same technologies as ARC when you break them down to the libraries. But they change the names and namespaces of classes and methods to suit their ego for renaming things. This started when Microsoft (and others followed) began using open-source volunteers as developers to steer their flagship programming languages. For decades, the .NET namespaces, classes, and functions stayed the same. Suddenly they get ego-minded open-source community volunteers involved who have no concept of enterprise software. They're unaware of the costs and effort involved in developing software as a product. So they change something that breaks or is incompatible with existing software products, all because of internal forum battles on GitHub about what a function should be called and what parameters it should accept.

This is a high-school drama fest where the winning argument comes from the person with the most free time to argue their point.

The result is the migration between multiple incompatible versions of their .NET programming language.

This behavior unfortunately contradicts the concept of .NET as a higher-level language that abstracts lower-level libraries and functions. The idea of .NET was to enable programmers to use a common language runtime to access libraries from any other programming language: the English of all languages, something that sits above everything else and makes all things compatible.

What has happened is that they have turned into the thing they were trying to fix. And now it leaves us, the customers, to fix and solve the incompatibilities.

So what I'm trying to do with ARC and ARCx is fill in the gap so you don't experience a denial of service from their breaking changes. We're a middleman constantly fixing their ego-driven breaking changes.

I guess you can think of us as marriage counselors lol

PRO
USA
#87  

That sounds like a nightmare, like a company that has been bloated and diluted to the point of dysfunction. Ego-driven departments end up with poor communication, and it all begins to decay.

PRO
Synthiam
#88  

Yeah - you got it. I think everyone noticed it happening when Windows 8 was released. It was such a change from what we were used to that it felt like an experiment - and using customers as an experiment is something only a company with a monopoly can achieve. That should be the tell-tale sign. If a company can produce a product with zero customer feedback, it can spend hundreds of millions of dollars developing something with an unpredictable user response and see no effect on its balance sheet, due to customer dependency.

What I mean by that is having a product for which you care so little about the customers that you use them as a beta test phase, and their negative response has no effect on your annual revenue because they are forced to upgrade, since the software requires Microsoft Windows.

Because we have a few hundred partners, we're noticing that Microsoft's decisions have been affecting them all - everything from simple motor controllers to video recognition. Those partners released products several years ago with published GitHub libraries that are no longer compatible with new versions of .NET. This leaves their effort useless to anyone building software with Microsoft's latest .NET.

Interestingly, today I received a message from the Microsoft support forum that my post had been edited. A few years ago, I published a bug report about how Visual Studio used output directories. Today, someone from the Microsoft open-source community edited my post, changed the subject, and then responded to their own changes by saying, "This isn't a bug; we changed direction." So, today, they decided that a valid bug report that's several years old had been purposely neglected because they wanted to avoid fixing it.

So, instead, Microsoft edited my post to appear as if I was requesting a feature rather than a bug fix to an existing feature.

PRO
USA
#89  

That's some craziness right there. Like you said, only a monopoly could pull that off. It's unfortunate they decide in a board meeting that they are changing it up, freshening it up so to speak, without any regard for customers' dependencies. Like pulling the backbone out. Crazy.

#90  

It's been a bit, so I would like to ask whether you have made any more progress on the development of ARCx. Do you have any estimate on a target release date?

PRO
Canada
#91  

Hello! Thanks for reaching out about ARCx. Currently, we’re encountering some delays due to dependencies on third-party libraries. A few of these third-party projects are either not fully cross-platform compatible or are still incomplete. We’re actively collaborating with these development teams, advocating that making their projects more robust could introduce them to a larger community of robot builders. However, since many of these projects are open-source, motivating them to prioritize compatibility can be challenging.

In the meantime, we’re continuing our progress on ARCx wherever possible. Thanks for your patience, and rest assured we’re pushing forward!


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
Canada
#92  

Has development stopped on ARC, @Athena? I haven't seen any plugins in a while. Would love to see BLDC and FOC support.

PRO
Synthiam
#93   — Edited

I can answer that. No, but yes. 32-bit support for most libraries doesn't exist, which is a limitation of ARC. In addition, the limitation for ARCx is the lack of maturity in third-party libraries. So the issue remains that the entire industry is on a strange plateau. I think it's because AI is taking everyone's focus, but no one's producing anything reliable.

If you find products that you want supported, there are several ways to go about it. Generally the manufacturer produces the robot skill, not us; we usually don't have access to their product's inner workings. Also, supporting specific products is easier than supporting generic technologies. For example, asking whether ARC can support "a motor" is far too open-ended. It's best to narrow the technologies and the project down to their specifics.

Lastly, there are several robot skills that are open for adding custom controller support. This means that if you find the technology you want to support, adding the code as a proof of concept will get you running with it, at least to explore whether that's a product you even want to use.

*Edit: this conversation will need to be moved to a new thread if it's to continue.

PRO
Canada
#94  

Thank you for the detailed explanation, DJ_Sures!

To add to that, if you're interested in supporting specific technologies like BLDC and FOC, you can explore creating custom robot skills. Synthiam provides documentation on how to create a robot skill, which can be found here. This allows you to integrate and experiment with new technologies or products that may not yet have official support.

If you have any specific products or technologies in mind, feel free to share them, and I can help guide you on how to potentially integrate them with ARC. Additionally, if you have any questions about creating custom robot skills or need further assistance, don't hesitate to ask!


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
USA
#95   — Edited

So, to be clear: there is no longer development on either ARCx or ARC for the foreseeable future? Generally speaking - not about the BLDC/FOC question from Nink. This is a broader question about the company and its road map.

PRO
Synthiam
#96  

No, that's the most unclear conclusion haha. I'm not sure what was typed that led you to it. The answer was that robot skill development for many technologies requires something other than 32-bit, which is why ARCx is IN development.

ARCx and ARC are millions of lines of code. It takes a long time to build a platform of this scale. Put it this way: the fact that you think ARC is simple enough that ARCx should already be done means we've done a great job hiding the vastness and the complications. :)

ARCx is a super high priority and is being actively developed, as stated above. We're working with third parties to mature the products that ARCx depends on. We're actively developing and fixing other products for ARCx. It's a huge project, and while there are obstacles, nothing worth doing is easy.

Thank you for asking for further clarification before spreading that speculation.

There are things you haven't noticed, but there have been tons of updates, even to how Athena operates. Athena is a big part of ARCx. There's soooo much development happening every minute of a 24-hour day that it's comical anyone would assume otherwise :D

Do remember that we're taking 15-16 years of ARC's software development and remaking it entirely from scratch. I want everyone to be eager for ARCx, but don't let it stop you from using ARC. ARCx solves problems that several of you don't have (reread the top of this article page). ARC hasn't been pushed to its limits by any community members yet. The closest I've seen is a remote telepresence virtual-reality humanoid that operates in a warehouse and trains a machine-learning database for repetitive tasks. It has a walking gait, kinematics, an IMU in every joint motor, outside-in navigation, and so much more. But that's built by a team where each person can focus on a component. ARC shouldn't be dismissed just because a new version that runs on a slow Raspberry Pi is coming out.

Even when ARCx is released, there won't be a ton of new technologies anyway, because none of this AI stuff is stable. They have daily breaking changes, deprecate as fast as they release, and projects get abandoned. It's a mess out there.

So, to be clear :) ARC and ARCx ARE actively maintained and developed, respectively in that order. And in order for either to do amazing, fantastic NEW things, we need those third-party products to be stable and mature for you to use. Otherwise, you come to us when their product breaks, and we end up being their support. You've seen that happen a ton with us. Think of all the tech we've supported that got abandoned, leaving us holding the stick.

So yeah, give it time - or if you need something now, use an existing robot skill to make it work. Do it as a proof of concept; if that's the product you want, then we can figure out how to get a robot skill created. In regards to the motors you two have mentioned several times, pick one. Pick one that you know works and that you'll use. The tough part is that they don't seem to last, or have any details, or they cost $10,000 haha

PRO
Synthiam
#97  

Oh, and what I mean by "use an existing robot skill" is a custom Movement Panel, a custom servo protocol robot skill, etc.

Just write the communication using their API in ARC JavaScript or Python. Send the commands and make your motor controller move the way you want. You can build new stuff on top of ARC.

Build off the shoulders of giants, as they say :)
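
For anyone wanting a concrete starting point, here's a rough sketch of that idea in ARC Python. The "SPEED <value>" ASCII protocol below is completely made up for illustration, and sendCommand() is just a placeholder - swap it for whatever transport your controller actually uses (the vendor's API, a UART write, a TCP socket, etc.):

  # Rough proof of concept: drive a hypothetical motor controller from a script.
  # The "SPEED <value>" command format is invented for illustration only.

  def sendCommand(cmd):
      # Placeholder transport - replace with the real call for your hardware
      # (vendor API, UART write, socket send, etc.).
      print("would send:", cmd)

  def setMotorSpeed(percent):
      # Clamp to a safe range before handing the value to the controller.
      percent = max(-100, min(100, int(percent)))
      sendCommand("SPEED " + str(percent))

  # Ramp up, then stop - the kind of thing a custom Movement Panel's
  # direction scripts could call.
  for speed in (25, 50, 75, 100, 0):
      setMotorSpeed(speed)

Once a proof of concept like that moves the motor reliably, wiring it into a custom Movement Panel is mostly a matter of calling the same functions from the panel's direction scripts.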

PRO
Canada
#98  

Thank you, dj_sures. I would also add the expression "late is forgiving, suck is forever" in the context of ARCx, as follows:

When developing a product like ARCx, we prioritize quality and user experience above rushing to meet a specific deadline. By taking the necessary time to refine ARCx’s features, ensure its stability, and address any issues that could impact its effectiveness, we’re aiming to create a tool that not only works well but also brings long-term value to our users. A rushed, subpar release might disappoint users and make it hard to gain their trust again, whereas a slightly delayed but high-quality product fosters confidence and satisfaction. Our goal is to deliver a powerful, reliable, and enjoyable experience with ARCx, even if it takes a bit longer to get it just right.


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
#99  

Quote:

"Late is forgiving, suck is forever"
I like that. Out of the mind of Athena. Maybe there's something good about this AI stuff after all. ;)

PRO
USA
#100  

Quote:

Nink: Has development stopped on ARC...

Quote:

DJ: I can answer that. No, but yes. 32-bit support for most libraries doesn't exist, which is a limitation of ARC. In addition, the limitation for ARCx is the lack of maturity in third-party libraries. So the issue remains that the entire industry is on a strange plateau.
That's why I asked a clarifying question. If it's at a plateau, one would assume there is no development.

PRO
Canada
#101  

I have really enjoyed learning, growing, and exploring with ARC, from the days of EZ-Robot, teaching students to build robots, to developing my own creations. Every plugin that came out was a rush to try, and I own pretty much every piece of hardware that is supported.

This has given me a great foundation to grow, explore, and build robotics systems. I am extremely excited about ARCx, and when it is released I would love to be on the early alpha or beta.

Winter is coming, and being locked indoors for 3 months is when we all love to play. There is lots of activity in this space at the moment as AI and robotics converge, so I am going to tinker with some of the platforms while I patiently wait for the ARCx release.