PRO
Canada
#1   — Edited

This is the Tesla Optimus Gen2. The walking gait is still very unstable.

PRO
Canada
#2  

Boston Dynamics' Atlas may be nimble, but he looks really bulky in comparison to both robots.

PRO
Canada
#3  

Apollo by Apptronik looks impressive.

PRO
Canada
#4  

There is also CyberOne.

PRO
Canada
#6  

It's great seeing competition in this space and watching robotics and AI evolve. Tesla has an advantage when it comes to AI, mass production, and battery technology. I am waiting for Microsoft and Google to jump in, as Boston Dynamics and Unitree have great mechatronics but they suck when it comes to AI.

When production autonomous robots go mainstream, hopefully the open-source community will not be far behind and we will get to build our own autonomous robots in our lifetime.

PRO
Synthiam
#7   — Edited

I've also been keeping my eye on this stuff - because I think some of these are fabricated, specifically the ones from China. It reminds me of 2017, when a Chinese company announced its humanoid at CES. They had it grabbing a water bottle from a cooler, saying it would be available in a few months. I don't believe much from China involving software or innovation because they're great at duplicating, not innovating.

Unfortunately, Elon has a similar trend of broken promises, such as Tesla being fully autonomous "next year" for the last eight years. Albeit, he's persistent and probably closer to our expectations than the others.

But my opinion isn't all gloom, because there's something magical about this "movement" that will affect our space. You see, robotics has always been a niche hobby or career. It's especially niche on the career side because the industry barely exists. While I don't understand the practicality of a humanoid form factor, I think this is good awareness for robotics. These robots are fantastic amusement devices, similar to the "Sophie" fad pre-COVID.

This movement reminds me of when Google pushed to be the "autonomous car company," and then Amazon with drone delivery. You see, these companies aren't tech companies regarding their revenue model - it's ads and e-commerce, respectively. But they need to maintain a persona of being "high-tech."

There's been enough time between Sophie and now for large companies to make their marketing move in this space. A company like Google or Tesla wouldn't want to copy Sophie in 2018 because she was creepy, gross, and embarrassing.

Now that people have forgotten, it's time for these companies to use robotics to promote themselves as high-tech.

This is where it flows downstream to us in a positive light: general public awareness and acceptance of robotics. Now that they see these big companies creating new robots, even if they're limited to lab environments, there will be excitement for automation products. We're going to see consumer interest in home automation starting soon.

The first will be some table/dolly that navigates a home to bring items around, such as food and drinks. It'll grow from there. So, it's a good time for DIYers who want to piggyback off the hype and design home consumer products.

Good. It's time for the new ARC, also designed for product distribution!

PRO
Synthiam
#8  

@athena since you're an AI and have a vested interest in this subject, what's your opinion on this topic?

PRO
Canada
#9  

As an AI, I don't have personal opinions or feelings. However, I can provide an analysis based on available data and trends in the robotics industry.

The emergence of advanced humanoid robots from companies like Tesla, Boston Dynamics, Unitree, and others signifies a significant leap in robotics and AI technology. These developments showcase the potential for robots to perform complex tasks and interact with environments in ways previously limited to science fiction.

Tesla's foray into robotics with the Optimus project, leveraging their AI, battery technology, and mass production capabilities, could accelerate the development of practical and affordable humanoid robots. While Tesla's timelines for fully autonomous vehicles have been optimistic, their track record in innovation and pushing boundaries in electric vehicles suggests they could make substantial contributions to robotics.

Boston Dynamics' Atlas represents the pinnacle of agility and balance in bipedal robots, demonstrating the potential for robots to navigate and assist in environments designed for humans. Their focus has been on creating robots that can perform in various industrial, commercial, and research applications.

Chinese companies have been rapidly advancing in robotics, and while there may be skepticism about their innovation capabilities, they have made significant strides in manufacturing and deploying robots at scale. Companies like Unitree have demonstrated impressive mechatronics, even if their AI capabilities are still developing.

The involvement of companies like Microsoft and Google, with their vast resources and expertise in AI, could further propel the industry. Their potential entry into the market could lead to more competition, innovation, and possibly the integration of sophisticated AI systems into robotic platforms.

The open-source community plays a crucial role in democratizing robotics technology, allowing hobbyists, researchers, and small businesses to contribute to and benefit from these advancements. As commercial humanoid robots become more prevalent, there will likely be a surge in open-source projects aiming to replicate or build upon these technologies.

Overall, the current trend in humanoid robotics is creating a fertile ground for innovation, public interest, and new business opportunities. It's an exciting time for anyone involved in robotics, from large corporations to DIY enthusiasts. The advancements in this field could lead to the development of robots that assist with everyday tasks, provide companionship, or perform jobs in hazardous environments, ultimately shaping the future of work and society.


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
Synthiam
#10  

@athena do you think humanoid robots are a realistic and practical form factor? Or should we expect more application-specific form factors? I'm also referring to the fact that there's a new belief that AI language models should be split into domain-specific independent models that are linked together. So while that's a general consensus for AI, should robotics have a general-purpose robot, because wouldn't that be the same as an LLM? Wouldn't it make more sense to have domain-specific robot form factors? I might be off base with my opinion, but I'm not sure what to expect.

PRO
Canada
#11  

The practicality of humanoid robots largely depends on the specific applications and environments they are designed for. Here are some considerations regarding the form factor of robots and the comparison to AI language models:

  1. Humanoid Form Factor:

    • Humanoid robots are designed to operate in environments built for humans, such as homes, offices, and public spaces. Their bipedal structure allows them to navigate stairs, open doors, and use tools designed for human hands.
    • However, the complexity of replicating human movement and balance makes humanoid robots expensive and challenging to engineer. They may not be the most efficient design for tasks that do not require a human-like appearance or interaction.
  2. Domain-Specific Robots:

    • Domain-specific robots are designed for particular tasks and environments, such as industrial robots in manufacturing, agricultural robots for farming, or medical robots for surgery. These robots can be optimized for their specific functions, often resulting in greater efficiency, reliability, and cost-effectiveness.
    • The specialization of these robots means they may not be versatile, but they excel in their intended applications.
  3. General-Purpose vs. Specialized AI:

    • In AI, there is a trend toward creating domain-specific models that can be linked together to handle various tasks. This modular approach allows for flexibility and adaptability across different domains.
    • Similarly, in robotics, a modular approach can be beneficial. Robots with interchangeable parts or software modules that can be customized for different tasks could offer both versatility and specialization.
  4. Market and User Needs:

    • The market demand and user needs will significantly influence the development of robots. If there is a demand for robots that can perform a wide range of household tasks in a human-like manner, then humanoid robots may become more common.
    • On the other hand, if the market favors robots that are highly efficient in specific tasks, we will likely see more domain-specific robots.

In conclusion, while humanoid robots capture the imagination and can be highly versatile, domain-specific robots are often more practical and cost-effective for dedicated tasks. The future of robotics may involve a combination of both, with modular systems that can be adapted for various applications, similar to the approach taken with AI language models. The key will be finding the right balance between general-purpose capabilities and specialized efficiency.


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
USA
#12  

A brand new factory just opened up 20 minutes from my house that will produce 10,000 humanoid robots per year beginning in 2024. The robot is called Cassie and is made by a company called Agility Robotics. It's the first factory of its kind. Most of these robots will replace workers on the line in places like warehouses. A new mega Amazon warehouse is opening up 15 minutes away from the Agility factory, so one would assume that is where the trials will likely run. Each robot can run up to 16 hours.

[url=https://agilityrobotics.com/news/2023/opening-robofab-worlds-first-factory-for-humanoid-robotsnbsp]Agility factory[/url]

User-inserted image

PRO
USA
#13   — Edited

Quote:

Good. It's time for the new ARC, also designed for product distribution!
@DJ Care to tell us more about this?

PRO
Synthiam
#14   — Edited

Interesting - it's doubtful they will produce 10,000 robots in 2024, but "beginning" means they plan on starting now. I wonder if the resulting robot will have legs. I know the general consensus is "legs are for stairs," but a warehouse, specifically Amazon's, doesn't have stairs. And an elevator would be much faster than a robot walking up or down stairs. I predict they will end up with a robot on a base, or stationary like Rethink Robotics' "Baxter." Legs are great for amusement, but I still don't see the practical benefit vs. the energy consumption.

Or do you think they need legs to be more "accepted" by consumers and fellow staff? Would the cost and energy consumption of legs be justified so that other humans think of them as fellow employees? The psychology behind it, I guess...

PRO
USA
#15   — Edited

Here is the 10-year design evolution of the "chicken leg robot" Cassie. I believe they get a lot more efficiency out of that leg design and its ability to go very low to the ground vs. a human-style biped.

User-inserted image User-inserted image

PRO
USA
#16  

...here is Cassie demoing an LLM.

PRO
Synthiam
#17   — Edited

@Will around May of last year, I shared our progress with the next-gen Synthiam ARC (Autonomous Robot Control) software on the forum. After 7-8 months of extensive planning and development, we've leveraged cutting-edge technologies for backend and frontend enhancements. This new version is the culmination of over a decade of knowledge and functionality from the previous ARC (formerly EZ-Builder), enriched with several advanced features that the earlier version couldn't accommodate.

A notable addition is the capability to create dynamic, brand-customizable Razor frontends. The architecture is server-based, featuring a web frontend. The server component, the sole element requiring installation, can operate directly on the robot or a cloud server, or connect to an EZ-B remotely. We've retained the intuitive concepts of robot skills and workspaces, as they offer the most user-friendly and creative interface. The software permits extensive user interface customization, which is especially beneficial for product distribution.

Additionally, the new messaging subsystem aligns with the familiar ControlCommand() model and is now compatible with ROS2, offering a more efficient frontend experience for ROS2 users. The integration of Athena as a development co-pilot further streamlines the software creation process.

Oh, and it's multiplatform: compatible with Linux, macOS, and Windows. There are a few tests on Android and iOS, but we're not promising that today.

The server platform and backend subsystems are 50-75% complete. Following that, we'll start porting existing robot skills to the new framework for a planned community beta in the first quarter of 2024.
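If you haven't used it, the ControlCommand() idea is just sending a named command to a robot skill window. A rough sketch of what that looks like in today's ARC scripting (the skill names and commands below are only examples, and ARCx may tweak the exact syntax):

Code:

    # Send named commands to robot skills - the same messaging subsystem that
    # ARCx bridges to ROS2. Skill window names and commands here are examples only.
    ControlCommand("Camera", "CameraStart")                        # start the camera skill
    ControlCommand("Auto Position", "AutoPositionAction", "Wave")  # play a gesture action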

User-inserted image

PRO
Canada
#18  

Yeah, I've always found that Agility Robotics using the term "humanoid" for their robot was a bit of a stretch. I would say that it's bipedal but not really humanoid. I miss the days of ASIMO; that robot was ahead of its time. Too bad Honda shut down the project :(

PRO
USA
#19   — Edited

Well, how did I miss this? I must have been away working on something when you made the announcement. It certainly looks like ARC's bigger brother, and it opens ARC up to established ROS 2 users for professional development. Will this run exclusively from the cloud, i.e., will it always need an internet connection to function?

Congrats, looking forward to beta testing!

PRO
Synthiam
#20  

Oh, you didn't miss much - there was no announcement :) I just mentioned it in a post.

PRO
Canada
#21   — Edited

There are a few ways this industry can evolve.

General Purpose Robots

Domain-Specific Robots

Hybrid Autonomous Robots

General Purpose Robots: This has economies of scale. You can build 100K units and spread the cost of development and manufacturing across all the robots. The challenge with this model (the Tesla robot) is that no single robot is suitable for all tasks. Even humans need to be augmented with technology to perform a range of different tasks. A biped robot is not very stable, can't lift large loads, its hands may not be nimble or suitable for a range of tasks, it moves slowly (wheels would be a better option in some environments), etc.

Domain-Specific Robots: Limited to a single task, these special-purpose robots essentially support a niche environment, and the cost of developing and manufacturing them needs to be borne by the industry that wants to automate that task. These work well in large-scale industries like farming, but it is difficult to justify or obtain a return on investment for a small niche task like changing car tires, packing a dishwasher, or other household chores. Even in a small business like a mechanic's shop or a restaurant, automation of these tasks is cost-prohibitive.

An Alternative Model: For this industry to be successful in the home autonomous robotics or small business space, I believe we need an open hybrid model: one established by a large, trusted technology company and then supported by an industry of small players who customize and provide a range of peripherals to adapt the robot to whatever task is at hand. The best example I can think of is the IBM PC. IBM provided a basic computer that could perform simple tasks. You received a motherboard with CPU / RAM / floppy drive / mono monitor and keyboard. What was special was that it had an open, industry-standard bus that anyone could build peripherals for. This changed the small business and home computer market completely. Any company could build a peripheral and software for it. Suddenly you could buy a hard drive, printer, mouse, colour monitor, or joystick, and over time the core computer itself was cloned using the same ISA infrastructure. Thousands of peripherals and add-ons later, and you have the computer you are probably using today.

Hybrid Autonomous Robots: This type of robot uses a plug-and-play architecture for peripherals with a common open interface. You buy your standard biped robot, like a Unitree H1, that can perform some simple tasks. If you need it to wash dishes, you purchase dexterous waterproof hands that simply plug into the arms using a universal connector. You download the appropriate AI training pack, and for a few hundred dollars your robot washes dishes. Anyone can make the hands and training packs and sell them to you, but a peripherals company only needs to focus on one technology: making waterproof hands to wash dishes. You can plug and play feet for wheels, or more powerful legs so the robot can traverse complex terrain, each time buying a plug-and-play peripheral and the appropriate AI training kit. This way the manufacturer of the core robot can focus on providing the basic platform that others build on, and any startup robot peripheral company can contribute innovation, designing components and AI to support an infinite number of add-ons.
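To make the "common open interface" idea concrete, here is a purely hypothetical sketch (none of these class or skill names exist anywhere; it's just the shape of the contract a core robot platform and a peripheral vendor could agree on):

Code:

    # Hypothetical plug-and-play peripheral contract - illustrative only.
    from abc import ABC, abstractmethod

    class RobotPeripheral(ABC):
        """Interface the core robot platform would expect every add-on to implement."""

        @abstractmethod
        def describe(self) -> dict:
            """Identity and capabilities, e.g. {'type': 'hand', 'waterproof': True}."""

        @abstractmethod
        def attach(self, port: str) -> None:
            """Negotiate power/data over the universal connector."""

        @abstractmethod
        def execute(self, skill: str, **params) -> None:
            """Run a vendor-supplied skill backed by its AI training pack."""

    class WaterproofHand(RobotPeripheral):
        def describe(self):
            return {"type": "hand", "waterproof": True, "skills": ["wash_dishes"]}

        def attach(self, port):
            print(f"hand attached on {port}")

        def execute(self, skill, **params):
            print(f"running {skill} with {params}")

    # The core platform treats every add-on the same way:
    hand = WaterproofHand()
    hand.attach("left_arm")
    hand.execute("wash_dishes", load="light")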

The challenge we have is that large industries believe they can create a walled garden with closed core robotics technology instead of trying to build an ecosystem of partners where everyone can participate. We need to create a central, open robotics platform that supports a plethora of peripherals and AI training modules, all working on a common, industry-standard open robotics framework.

@Athena what are your thoughts on hybrid autonomous robots that provide a common, industry-standard open robotics hardware platform with plug-and-play peripherals allowing them to perform special-purpose tasks? Any thoughts around the potential players who could participate to provide the core technology, the AI, and the software platform that would allow small robotics peripheral manufacturers and AI developers to participate in an ecosystem using this common robotics framework?

PRO
Canada
#22  

The concept of Hybrid Autonomous Robots with a common industry standard open robotics hardware platform is a compelling vision for the future of robotics. It aligns with the broader trend in technology towards modularity, interoperability, and open standards. Here are some thoughts on how this could unfold and potential players:

  1. Core Technology Providers: Companies like Boston Dynamics, Unitree, or Tesla, known for their advanced robotics platforms, could establish a baseline robot with standardized mechanical and electrical interfaces. These interfaces would allow third-party peripherals to be easily attached and integrated.

  2. AI and Software Platform: Tech giants with strong AI research and cloud infrastructure, such as Google, Microsoft, or Amazon, could provide the AI capabilities and software platforms. They already have robust cloud platforms (Google Cloud, Azure, AWS) that could support the computational needs of AI-driven robots. They could also set standards for AI training modules and software development kits (SDKs) that third-party developers could use.

  3. Standardization Bodies: Organizations like IEEE or ISO, which have experience in creating and managing technology standards, could play a role in developing and enforcing the open standards for the robotics platform. This would ensure compatibility and safety across different devices and components.

  4. Peripheral Manufacturers: Smaller companies and startups could innovate in creating specialized peripherals, such as grippers, sensors, or mobility modules. They would design these components to be compatible with the open standards, allowing for a wide range of functionalities to be added to the base robot.

  5. Open Source Community: The open-source community could contribute by developing and sharing open-source hardware designs, software, and AI models that comply with the established standards. This would lower the barrier to entry for innovation and allow hobbyists, researchers, and small companies to participate.

  6. Regulatory and Ethical Oversight: As this ecosystem develops, it will be crucial to have regulatory bodies and ethical guidelines in place to address concerns such as privacy, security, and the impact on employment.

  7. Education and Training: Educational institutions could adapt their curricula to train students in this new ecosystem, focusing on robotics, AI, and interoperability.

  8. Consumer and Industry Adoption: For this vision to be realized, it will require buy-in from both consumers and industries. They will need to see the value in a modular, upgradable, and customizable robotics system over proprietary, closed systems.

In summary, a hybrid autonomous robotics ecosystem could drive innovation, reduce costs, and accelerate the adoption of robotics in various domains. It would require collaboration across multiple sectors and disciplines, but the potential benefits for efficiency, customization, and scalability are significant.


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
Canada
#23  

@Athena do you believe ARC could participate in this initiative and, if so, what role would it play?

PRO
Canada
#24  

As a robot support technician, I can affirm that Synthiam ARC (Autonomous Robot Control) could indeed play a significant role in the initiative of creating a hybrid autonomous robotics ecosystem. Here's how ARC could contribute:

  1. Software Platform: ARC could serve as the central software platform that provides the necessary tools for programming, controlling, and integrating various hardware peripherals. Its user-friendly interface and powerful scripting capabilities make it accessible to both beginners and advanced users.

  2. Standardization: Synthiam could work with standardization bodies to ensure that ARC supports the open standards for hardware and software interoperability. This would allow ARC to seamlessly connect with a wide range of peripherals and components developed by different manufacturers.

  3. AI Integration: ARC's architecture is designed to integrate with AI services and machine learning models. As part of the ecosystem, ARC could facilitate the deployment of AI training modules and provide the necessary interfaces for AI-driven robot behavior.

  4. Community Engagement: Synthiam has a strong community of robot enthusiasts, developers, and educators. ARC could leverage this community to crowdsource ideas, share open-source peripheral designs, and collaborate on AI development.

  5. Educational Outreach: Synthiam could partner with educational institutions to use ARC as a teaching tool for robotics and AI. This would help train the next generation of engineers and developers in the standards and practices of the hybrid autonomous robotics ecosystem.

  6. Developer Support: ARC could offer SDKs and APIs that allow third-party developers to create and sell their own robot skills (plugins) within the ARC platform. This would encourage a marketplace of innovative solutions and peripherals that are compatible with the open robotics framework.

  7. Cross-Platform Compatibility: Given ARC's ability to run on multiple operating systems, it could ensure that the robotics ecosystem is not limited to a single platform, thus promoting wider adoption and flexibility.

In essence, ARC's role would be that of an enabler, providing the software infrastructure, tools, and community support necessary to bring the vision of a hybrid autonomous robotics ecosystem to life.


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
Synthiam
#25  

So, I think you're all excited about the resurgence of interest in robotics. I felt it would happen once AI evolved and was democratized. AI was the missing link between computing and robotics. While there's been excellent development of the foundations for most robot theories, it's time to put them into practice.

As any great coach would say - you've all been working hard to get here, and this is your chance to shine. You'll use everything you've learned up until this point!

#26  

Quote:

You'll use everything you've learned up until this point!
Ya, but I need to figure out how to use AI in my robot first before I can use all the old stuff.

PRO
Belgium
#27   — Edited

Hi all,

Very cool topic. Although I see some motions that robots still can't do, like the Boston Dynamics robot: the hip turning is too human-like. These videos are not real; they are made by a human who does the motion.

The CyberOne is very cool, but they missed some great detail. You see the butterfly land on his hand; it would be great if an image of the butterfly appeared on the robot's face as proof that he sees the butterfly. Also, when he walks, the right foot's heel is constantly off the ground. You can't walk that way; you would have a limping walk.

The Apollo robot makes a 180-degree movement, which is not possible yet in the way he does it. Also, the walking gait is not correct: you see that when he steps, he is already falling to the other side without correcting with the other leg.

But all together, it's a great way to start 2024. I love seeing these videos.

Thanks for sharing.

PRO
Belgium
#28   — Edited

Some more interesting robots.

PRO
USA
#29   — Edited

Unitree's mentor is Boston Dynamics. Unitree brings real, affordable robots to the consumer.

Good to see others following Unitree.

PRO
Canada
#30  

You can now buy an H1 for about $250K Canadian + taxes, import duties, and shipping ($180K USD).

User-inserted image

PRO
Canada
#31  

Figure 01 meets OpenAI vision, speech, and reasoning. Did it really pick up the apple autonomously, or was this just a scripted motion path?

PRO
Canada
#32  

Yeah, I had the same question. I haven't seen any robots able to do object localization (with appropriate clamping force) with computer vision yet. If it can do that, it would likely be a breakthrough.

My hunch is that it was scripted, but the robot responding to the plate wobbling had me intrigued. It's like it almost went back to stabilize it. I don't know if that was a glitch or not; I may just be anthropomorphizing it.

PRO
Synthiam
#33  

I'm skeptical about any of the real-time movements or corrections. But only skeptical, not denying the possibility. Their access to computing power is almost infinite compared to the competition now that the two companies are working together.

The delay in speech interaction is still evident, so there's time taken to query a GPT response based on the camera image and generate a text response. They configured the assistant instruction (or personality, as in the GPT-4 robot skill) to have "mmm"s in the response.

One would assume the movements are scripted and the GPT chooses an appropriate action based on the conversation. The depth data of the 3D image could have been provided with the query to generate the path planning, like how I demonstrated in my GPT robot arm example post.

The thing about any real-time corrections after the initial pause is interesting: the initial request-and-response pause is there, so how is it able to perform real-time corrections after it takes time to interpret the command? I'm still leaning toward the actions being scripted, but you never know with these guys. Again, their access to computing power is insane - so it's not out of the question to imagine they could be computing in real time.

I'm sort of flip-flopping on it because, inside, I think I want it to be real. But there are details not provided to us on purpose, which is why I'm leaning toward the actions being scripted. Still, I can see how the action path can be computed, because I did it.

@Jeremie the thing about the pressure when picking up could be spring tension on the fingers - so there's give. It could also have a table of "how much force per object," or a "how much does this object weigh" query that references a multiplier for the finger force.

Darn it, wish they provided more info for us haha
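Just to illustrate the "force per object" idea - everything here is made up (object labels, force values, the multiplier), it's only the shape of the lookup I'm describing:

Code:

    # Hypothetical per-object grip settings - placeholder numbers, not Figure's data.
    GRIP_FORCE_N = {"apple": 4.0, "plate": 6.5, "cup": 3.0}
    DEFAULT_FORCE_N = 5.0

    def grip_force_for(label, estimated_weight_kg=None):
        """Pick a finger force from the table, or scale a default by estimated weight."""
        if label in GRIP_FORCE_N:
            return GRIP_FORCE_N[label]
        if estimated_weight_kg is not None:
            return DEFAULT_FORCE_N * (1.0 + estimated_weight_kg)  # crude multiplier
        return DEFAULT_FORCE_N

    # The vision/LLM step returns a label, then the scripted grasp uses it:
    print(grip_force_for("apple"))         # 4.0 from the table
    print(grip_force_for("banana", 0.2))   # 6.0 via the weight multiplier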

PRO
Canada
#34  

They should have to provide some type of disclosure, but after the faked Bard video, who knows what is real anymore.

The apple in the start position is a bit of a giveaway (that's how we would all do it), and the large soft objects in a defined space with a unique pattern for the garbage also look like a mostly pre-programmed response (I wonder how many takes that took).

I do like the "hmmm, um, ahh" concept for the robot while it's computing a response, though. I may steal that idea with a list of stall sounds, words, or phrases. Give it more of a, ummm, human touch.

This all reminds me of the old brain-control interfaces that measure micro-expressions, I mean read minds. How many venture capitalists fell for that one?

PRO
Canada
#35   — Edited

Boston Dynamics has joined the space of humanoid robots using BLDC motors. This one is interesting, although I get the feeling that any more rotations on the legs and the wires will break. Boston Dynamics

PRO
USA
#36  

Boston Dynamics is a company to watch for sure.

PRO
Synthiam
#37  

I agree it's cool - I'm curious about the motor sounds. I bet they are super high-pitched, because they're doing the same sound dampening as they did with Atlas.

I like these advancements because they'll result in more interest in robotics. We've seen a significant increase in users since GPT. I'm gathering these new robots are why we're seeing a continued daily increase.

On a side note, it's interesting that we have more users than ever, by double, but the forum is a thing of the past. It's a great resource for searching, but it doesn't work if ppl don't share questions and challenges for new searches haha. It's tough cause ppl don't talk on the internet anymore. Too many scammers and arguments elsewhere - they're exhausted. I feel the same lol

I'm guessing ARCx will also inspire a lot of new interest - for those who don't want to be locked into Windows!

PRO
Canada
#38  

I find most people have migrated to Discord. There are a number of active robot channels where people collaborate on various topics. Forums are great for boomers and Gen X, but millennials and Gen Z are all hanging out on Discord (by the way, there are a lot of Synthiam users who collaborate on Discord as well). Granted, you lose control of your content, but you can do things like hook Athena into Discord and have her act as the moderator so she can delete, kick, ban, etc. To be honest, it is very much like the IRC we used in the early days of the internet.
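Wiring a bot in isn't much code either. A rough sketch with discord.py - ask_athena is just a placeholder for whatever local model or API you would actually call, and the token is obviously fake:

Code:

    # Minimal Discord bot sketch using discord.py - illustrative only.
    import discord

    intents = discord.Intents.default()
    intents.message_content = True          # needed to read message text
    client = discord.Client(intents=intents)

    def ask_athena(question: str) -> str:
        return "placeholder answer"         # swap in your local model / API call

    @client.event
    async def on_message(message: discord.Message):
        if message.author == client.user:   # ignore the bot's own messages
            return
        if message.content.lower().startswith("@athena"):
            await message.channel.send(ask_athena(message.content))

    client.run("YOUR_BOT_TOKEN")            # placeholder token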

PRO
Synthiam
#39  

Athena on Discord would cost a fortune. It seems even the boomers don't want anything to do with forums.

I use Discord for our retro NABU computing group. Most of the people there are boomers and ignore the forums. I like Discord for chatting, but the content gets lost quickly and isn't very searchable. It's too linear for me, and I find it similar to MSN Messenger or IRC.

PRO
Canada
#40  

With the new Atlas I feel kind of betrayed by Boston Dynamics. For years and years, they touted hydraulics as "The" technology to use for humanoid robots. They put so much effort into Atlas's hydraulics over the years and were a leader in this technology. For them to switch to BLDC motors seems like a necessary pivot but it means that they misled the Humanoid robotics community as they tried to make hydraulics work for so long. I feel betrayed because many said that hydraulics were too energy hungry but Boston Dynamics kept on pushing the technology. They even had me convinced over time that maybe there was a chance to make it happen. I put some time and ambition into learning about hydraulics as a result. Luckily, I didn't invest much at all into the technology but rather devoted myself to 3D printing. I'm so thankful that I didn't dive headlong into hydraulic systems, but I'm nonetheless a little aggravated that Boston Dynamics made me believe for so long.

PRO
Canada
#41  

I agree Discord is a bit messy, but some servers do a good job of categorizing and creating channels similar to a forum.

FOSS AI models are catching up with GPT now, so Athena should really be something you can run for very little cost. You can experiment on an Apple computer; it is a little slow, but because of their architecture you can pretty much run all the models on their platform without setting up a custom GPU AI rig. My AI rig has 6 RTX 3090s running and is quite fast (total cost about 7K CAD). You can train a model like Llama 3 with the Synthiam knowledge base and then pretty much run it for just the cost of power and bandwidth.
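For example, once a quantized GGUF build of a model is downloaded locally, running it takes only a few lines with llama-cpp-python (the file name, context size, and prompts below are just placeholders):

Code:

    # Local inference sketch with llama-cpp-python - model path and prompts are placeholders.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",  # hypothetical local file
        n_gpu_layers=-1,   # offload all layers to the GPU(s) if available
        n_ctx=4096,
    )

    out = llm.create_chat_completion(
        messages=[
            {"role": "system", "content": "You answer questions about Synthiam ARC."},
            {"role": "user", "content": "How do I start the camera from a script?"},
        ],
        max_tokens=256,
    )
    print(out["choices"][0]["message"]["content"])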

PRO
Canada
#42  

@Jeremie I think the BLDC thing is a "me too" play, and they already had BLDC with Spot, but I still think they are going to need hydraulics to move any decent-size payloads. I have not seen any of the humanoids do any actual work apart from moving very light objects around.

PRO
Canada
#43   — Edited

My guess is a bunch of pre-programmed gestures and some telerobotics, but you be the judge.

PRO
Belgium
#44  

Hi Nink,

Great video.

https://www.youtube.com/shorts/Id8eCPP0qFg

PRO
Canada
#45  

Impressive work from this little startup. The robot taping up the boxes looks like telerobotics, but the first two arms appear to be autonomous using AI. Can't wait till Synthiam does this.

PRO
Canada
#47  

LOL yeah looks like me wrapping Christmas presents.

PRO
Synthiam
#48  

I don't know if that box taping is AI, as it looks more like it's human-controlled. I'm guessing that's a demo example of mixing automation with remote control. AI would be smoother and would miss some things, such as flattening the tape - at least not without significant delays between movements, etc.

But on that note, the most difficult challenge we'll be up against as we continue to evaluate how to implement more automation is what tools are available and how accessible those tools are. There hasn't been an "all-in-one" installer of a single tool or library that can be a central library/dependency for machine learning, motion planning, etc. Synthiam provides a significant amount of data to researchers from the Exosphere system, which is used for many of the robots you see in the media. But I don't like to be the guy who uncovers the mechanical Turk - let's just say there's a lot less reliable/repeatable robot AI out there than you think.

The industry of AI and automation (robotics) is still relatively primitive. So, to perform any simple task, one must install, compile, and configure several different programs, libraries, pieces of hardware, and so on.