
IBM Watson Plug-in Service?

I’ve been looking into the services provided by Watson. I’m really impressed. Is this something that could be implemented like the other services we have, like Bing and Microsoft? They have a 10,000-a-month limit.



#2  

@fxrtst I might be able to make a plugin; can you share some more details? I haven't been following IBM and Watson services lately; there are so many cognitive services now.

#3  

I can also work on a plugin. We can share the work if you want.

Jean Luc

PRO
USA
#4  

Ok, here I go, wish me luck...

1. Invoke magic plugin maker
2. Make IBM Watson plugin, make it cool
3. Cook for 30 minutes at 350
4. Place new plugin on 4.25 floppy disk
5. Run
6. End

Hmmmm nothing .... dang

PRO
Synthiam
#6  

If you use NuGet in Visual Studio and type in "Watson", you will see the libraries available for it. There's a bunch that you can build around.
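
For anyone curious, the REST calls those libraries wrap are pretty simple too. Here's a rough sketch of a Text to Speech request with plain HttpClient; the endpoint, voice name and basic-auth credential style are assumptions based on the Bluemix service credentials of that era, so check them against your own service before relying on this:

    using System;
    using System.IO;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    class WatsonTtsSketch
    {
        // Placeholders copied from the service credentials page on Bluemix.
        const string Username = "YOUR_TTS_USERNAME";
        const string Password = "YOUR_TTS_PASSWORD";
        // Assumed endpoint of that era; yours may differ, so check your service credentials.
        const string Endpoint = "https://stream.watsonplatform.net/text-to-speech/api/v1/synthesize";

        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                // Watson used HTTP basic auth (username/password) before IAM API keys.
                var token = Convert.ToBase64String(Encoding.ASCII.GetBytes(Username + ":" + Password));
                client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);
                client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("audio/wav"));

                var body = new StringContent("{\"text\": \"Hello from ARC\"}", Encoding.UTF8, "application/json");
                var response = await client.PostAsync(Endpoint + "?voice=en-US_MichaelVoice", body);
                response.EnsureSuccessStatusCode();

                // Save the returned WAV; a plugin would play it through the EZ-B instead.
                File.WriteAllBytes("watson_tts.wav", await response.Content.ReadAsByteArrayAsync());
            }
        }
    }

A plugin would do roughly the same thing and then stream the returned audio to the robot instead of writing it to disk.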

PRO
Canada
#7  

I tried to connect the EZ-B to Bluemix using the MQTT library but was not able to get it to connect (it uses tokens). My next step was to put a gateway in between and run Mosquitto in the cloud, but I never got around to it. If you do get it working, that would be fantastic.

#8  

I began to adapt my text translator plugin to the Watson services and the first test is running fine. I will give you more details after Christmas.

Enjoy Christmas with your family.

#9  

Sorry for the typing bugs from my mobile phone

#10  

Good to make EZ Robots as smart as we can.

#11  

I would really be interested in this as well.

#13  

DataMonsters and the linked Google Doc give a good breakdown of all the chat services.

But I think it is important to remember that many of these services can do far more than offer chat. Many of them are cognitive services where intelligent chat is just one thing that they can do.

And I think it is important that we all understand that utilizing any of these cognitive services or chatbots will not make your robot intelligent or bypass the work of making your robot intelligent in itself. In fact, it will create more work, even with plugins: you still have to create scripts within ARC and logic on the service side in order to achieve your goals from connecting to services like these.

My concern, from some posts I read, is that some folks think connecting to these cognitive services will be their robot's equivalent of Mario finding and eating a mushroom to become big, like in the Mario Brothers game.

And the reality is these services can offer powerful, modern tools to make our robots smarter, but it will continue to require some heavy scripting to "make Mario big".

#14  

I am always curious about the new stuff out there, but yes, I think @JustinRatliff is so right about this...it takes a lot of time and effort to get stuff done using those cognitive services, even when perfect ARC integration is provided!:)

The api.ai plugin works very well and the service has a steep learning curve, so you can get stuff done...but still, it does not work everything out for you; there is a lot of work, and one needs to have a clear goal on what should be accomplished!

I am happy another cognitive service is being recognized in the community and there are people out there ready to give it a shot! Let's see how far we can push this!:D

#15  

Please make this so I can win Jeopardy.

PRO
USA
#16  

I have an interest in how I can use Watson not only for adapting dialog in a Watson chatbot, but also with conversation and databases geared towards a specific market.

Let's say, for instance, Alan (my robot) is installed in a hotel or a merchant store as an access point for information. Using data and dialog scripted around a certain product or service allows one to adapt and fine-tune the information a customer might ask about a specific product.

It's a fascinating thing to watch unfold (AI and chatting): as services like Watson begin handling complex human language for chatbots, more people will feel comfortable exchanging information with new technologies. We now have three Alexas in the house and it's amazing how quickly you adapt to speaking and interacting with them. More so than Siri, which I rarely use on my iPhone. Maybe pushing that button to talk breaks the magic?!?

Anyway, this stuff is just evolving so quickly! Being able to merge these evolving technologies with EZ-Robots makes those robots seem even more impressive and useful!

#17  

@fxrtst I think that is a good use case for the Watson API, or probably any of the other chatbot cognitive services. So to achieve that, here is the recipe:


Step 1. Have a robot with an EZ-B and ARC, with a connection to the robot and the internet
Step 2. Create a plugin to connect to the IBM Watson services
Step 3. Set up your individual Watson user service account
Step 4. Learn how to set up the services on the Watson cloud side of the chatbot
Step 5. Actually set up and configure the chatbot on the Watson cloud side (the logic for the specific chatbot needed)
Step 6. Create scripts in ARC to query the Watson service and return results (see the sketch at the end of this post)
Step 7. Adjust your ARC scripts and Watson cloud-side logic as needed to return intelligent results, as it will take some fine tuning to get the results you desire.

Now in step 7, by fine tuning I mean (this is a hypothetical but realistic example): let's say you get everything working, you configure the robot with a ready-made database about "Hotels", and you create your speech script in ARC to talk to the Watson service through a future plugin. You say, "Tell me about this hotel"...and your robot says, "You said Hotels, I like Hotels, would you like to know more about Hotels?" Then you say, "YES!" and, to your horror, it repeats, "You said Hotels, I like Hotels, would you like to know more about Hotels?"

That's when you know you need to revisit the logic on the Watson side of your service setup, because it would indicate you did not interconnect the logical functions of the available services correctly to produce what you wanted.

This "recipe" would be the same for any of the major chatbot cognitive services like ai.api from google or any of the others.

#18  

I agree Justin I have test more Watson conversation services and it is a little bit worst than api ai Pick up forward or look forward make jd go forward We call that artificial intelligence or machine learning....

I prefer To invest in my synbot programmed intelligence.

#19  

I'd like to add, going back to my "Mario eating the mushroom to get big" analogy: we've had api.ai for a while, but I don't think a lot of us have used it very much because we have to know how to configure it on the services side.

Looking at just the api.ai service, there is training available, but training takes time, and training sources from Google and elsewhere are not going to be specifically geared toward what most in the EZ-Robot community would probably want to see. A lot of the training is geared towards, for example, making an intelligent chat agent on Facebook with a text chat interface. Now you might think, "Ok, I can take that and, instead of Facebook, direct the chat to EZ-Robot" - but you'd be wrong; the services don't reconfigure as easily as you would think without a fair amount of knowledge.

And we are all drawn towards different cognitive/chat services, which means our time and resources are divided. By all means, everyone should use what they want to; I'm just pointing out that our chatbot creation work is divided, and I don't know if that divided work has been fruitful.

So far my interest and time have been in api.ai. I have a "day job" interest in this service as well.

If there is going to be a team of sorts to collaborate on really getting Watson working with ARC, and to get the knowledge out there showing how to get a chat service working, where the team shares the work, I'm totally in!

PRO
Canada
#20  

One method to move forward is to sign up for an IBM Cloud account and use the IoT services with MQTT to connect the EZ-B to the IBM cloud. https://console.bluemix.net

Once we get the EZ-B connected, the rest should be fairly straightforward, and we can even use Node-RED to link to the Watson APIs.
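
In case it helps anyone get a first connection going, here is a rough device-side sketch using the M2Mqtt library from NuGet. The host name, the d:org:type:id client ID format, the use-token-auth username and the iot-2 topic names are my assumptions from the Watson IoT Platform conventions of the time, so verify them against your own Bluemix org:

    using System;
    using System.Text;
    using uPLibrary.Networking.M2Mqtt;            // M2Mqtt NuGet package
    using uPLibrary.Networking.M2Mqtt.Messages;

    class WatsonIotSketch
    {
        static void Main()
        {
            // Placeholders from your own Watson IoT Platform device registration.
            const string OrgId = "YOUR_ORG_ID";
            const string DeviceType = "ezb";
            const string DeviceId = "jd-01";
            const string DeviceToken = "YOUR_DEVICE_AUTH_TOKEN";

            // Assumed host and client-id convention: d:<org>:<deviceType>:<deviceId>
            var client = new MqttClient(OrgId + ".messaging.internetofthings.ibmcloud.com");

            // Commands pushed back from the cloud (e.g. by a Node-RED flow) arrive on iot-2/cmd topics.
            client.MqttMsgPublishReceived += (sender, e) =>
                Console.WriteLine("Command on " + e.Topic + ": " + Encoding.UTF8.GetString(e.Message));

            // Token auth: fixed username "use-token-auth", password is the device's auth token.
            client.Connect("d:" + OrgId + ":" + DeviceType + ":" + DeviceId, "use-token-auth", DeviceToken);
            client.Subscribe(new[] { "iot-2/cmd/+/fmt/json" },
                             new[] { MqttMsgBase.QOS_LEVEL_AT_MOST_ONCE });

            // Publish a sensor reading as a device event for the cloud side to pick up.
            client.Publish("iot-2/evt/status/fmt/json",
                           Encoding.UTF8.GetBytes("{\"d\":{\"ping\":42}}"));

            Console.ReadLine(); // keep the connection open so commands can arrive
        }
    }

The idea would then be for a Node-RED flow on the Bluemix side to pick up the device events, run them through the Watson services, and publish commands back on the cmd topic.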

#21  

@Nink can you give us a tutorial or video on how to do that to get the ball rolling?

#22  

@Nink, it sounds like you have some understanding here. Can you elaborate? This is an interesting subject and I would like to learn.

PRO
USA
#23  

@justin, thanks for elaborating. That was my understanding of how all the services worked. I've downloaded and poked at a few services and they all seem to be very similar in function and setup. The deep learning side (AI) is what I find fascinating, and I'm not sure whose service is the best and/or easiest to set up. I think Watson has had the most public coverage, with their commercials and the exposure from Watson on Jeopardy, but that doesn't necessarily make it the right choice...or does it?!?

@nink, I agree it's worth exploring.

I think it might be beneficial to expand on and discuss user experiences with any of the services that are offered (api.ai, Synbot, Watson, etc.).

PRO
Canada
#24  

@perry_s @JustinRatliff I have played with the APIs and have also built a Raspberry Pi-based TJBot, a cardboard robot that connects to Watson chat services. You can find out all about him here: http://www.research.ibm.com/tjbot/ and the Instructables are here (you can also 3D print him): http://www.instructables.com/member/TJBot/ He is actually fun to make and play with. Most makers will have all the parts except the pesky NeoPixel LEDs, but you can get them from Adafruit.

I also followed along with Bob Dill's Zero to Cognitive course about 8 months ago: https://www.youtube.com/watch?v=Jj7IFjd3FyI It has all changed now and a lot of the pieces in the tutorial no longer work out of the box, but it is a good starting point as well.

User-inserted image

#25  

Hey guys, I am the last person to say that the IBM implementation of a cognitive AI is not worth exploring...but I think that @JustinRatliff is very right about one thing!

Quote:

And we are all drawn towards different cognitive/chat services, which means our time and resources are divided. By all means, everyone should use what they want to; I'm just pointing out that our chatbot creation work is divided, and I don't know if that divided work has been fruitful.

We here at the forum are only a few, and it is kind of hard to get things going if the few of us are also working to gain ground in different fields...so if there is a team going to explore what can be done with Watson, it could be a very interesting thing for the community. But on the other hand, api.ai is already working perfectly with ARC, and @CochranRobotics contributed some good tutorials and a sample ARC project so everyone can get started easily!

Maybe it would be cool if we started different threads for the two services, containing the progress being made or some examples of use and code?

I do not know about Watson, but a very tempting aspect of api.ai is also the integration of the Google Assistant...you can write your own app which can be invoked by a certain trigger phrase...for example, @fxrtst could use something like "I want to talk to Alan", and this would be the entry point where Alan takes over!

Also, the Google Assistant works on the Raspberry Pi, and that version can use custom wake words...so instead of "Hey Google" or "Alexa" you would say "Hey Alan" to start the service. From this point on you could use all the services of the Google Assistant, plus all the apps that you coded for, e.g., Alan!

I have encountered the same problem that @JustinRatliff mentioned, which is...there is no magic mushroom for Mario; we need to put the work in to get things done, and most of us have a day job too!

Which does not keep me from being a robot enthusiast and dreaming of my robot moving around, answering my questions, telling me about the weather, and reminding me not to forget that I have to buy milk today...it's not out of reach!:)

#26  

@Nink That is very good work! I LOVE the look of that cardboard robot, and will check the links NOW!:D

PRO
USA
#27  

@mickey666maus, you have several good points. I have to look deeper into Google's implementation. I think I may have looked into it at the beginning of 2017 (did it change names to Dialogflow?), but I like the idea of tapping into Google's Assistant and tying things together.

I'll look into David's examples and go from there. I have an important meeting in a couple of months and want to try to implement some AI with Alan. api.ai might be the best route, since it's already working with ARC and I have a short timeline.

#28  

@fxrtst well, I guess your short timeline is our short timeline.

I'll further add this about Watson vs. api.ai: both might be needed in the long run because they do not do 100% the same thing. IBM services come with that name recognition, and if you were doing a deep dive into cancer research you'd use Watson, not api.ai. Meanwhile, api.ai is used for a lot of the more common, automated daily business interactions that we don't think anything about. The same goes for the Microsoft Azure cognitive bot services, which are very similar to api.ai.

The other challenge we constantly face is that training material from a year ago, or even 6 months ago, on these services is usually dated! The technology changes rapidly.

But given the shortness of time, I think api.ai is better if you are hoping that several of us who are already interested will jump in and experiment and innovate along with you?

PRO
USA
#29  

@justin, wholeheartedly agree about the rate of change. It's actually a bit scary.

In short, without giving too much away at this juncture, I will be sitting on the opposite side of the table from a very big player. If this goes well, I will be in an even bigger meeting with many, many big hitters. The more Alan/Alena can demonstrate themselves in a commercial-use environment, the better. A mock usage will be the best example.

I will be finishing the new Alan and Alena in about two weeks and should be able to concentrate energy towards setting them up for a case study for this meeting.

Any help would be most appreciated!

#30  

@fxrtst, if you'd like to send me an email at the Gmail address in my profile, I might be able to work with you more one-on-one if you'd like?

I think we all want to see your Alan and Alena be successful!

PRO
USA
#31  

Ok will do tomorrow. Thanks.

#32  

@fxrtst Yes... api.ai was recently renamed to Dialogflow, but the interface is unchanged and the integration with ARC should still work out OK!;)

And yes, there is constant change, and the improvements made are worth keeping an eye on...so if there are people wanting to develop for the other cognitive services out there, it would be a very, very good thing!

But Dialogflow is a very easy-to-use and well-integrated option within ARC, which is also very valuable since it belongs to Google and we will not be at risk of being stuck with a dead end...

I checked the links @Nink provided and it looks very interesting; we as DIY builders are given the chance to use the IBM Watson service for our own projects, and it seems like there will be a community growing at a good rate!

On the other hand, Google released kind of the same thing a while ago... @ptp was already looking into this! It is also a Raspberry Pi with a microphone and speaker, which lets you implement the agents you build and try them out on your own system! It was sold out and I was not able to get one, but it should be back soon...

You'll find info about it here... https://aiyprojects.withgoogle.com/voice#project-overview

#33  

As Professor Farnsworth would say, "Good news, everyone!" I just retested the api.ai plugin setup with a clean project and it still works, even though Google rebranded it and changed the website console name. All of DJ's directions from the plugin page still apply; you'll just be directed to the new website when you try to log in to the old api.ai console page.

#34  

Good news...and Happy New 2018, may it be a good one for robot building!:)

#35  

Hi Wil,

First, Happy New Year and Best Wishes for the upcoming year...

Wasn't Dave Cochran working on a lot of this with api.ai? Is any of his work still applicable and usable? I know it is about a year old, but he was going in a direction like what is being discussed.

He is not available due to his workload, but can anything he did still be used?

Ron

P.S Happy New Year @Mickey666Maus

#36  

Quote:

Wasn't Dave Cochran working on a lot of this with api.ai? Is any of his work still applicable and usable? I know it is about a year old, but he was going in a direction like what is being discussed.

Yes he was, and yes, it is all still applicable... I did a lot of research on api.ai too, and it is easy to use, and building your own application for it is quite doable! I will try to sit down and make an example of usage...

#37  

Thanks, I plan on doing a lot more with Antonn this year. I have a lot to learn about api.ai and what it will do for my project. I will be watching for your example and will continue to try to gather the needed information.

Ron

#38  

I created a video tour/overview of chatbot services to share my thoughts in video format for members that might be interested in, or confused about, what they are and how to use them.

#39  

I looked further into the Watson service and I signed up for an IBM Cloud account. The interface for the chatbot agent setup is now web-based, just like Dialogflow (api.ai), which is good news.

The interfaces don't appear as intuitive to me as Dialogflow's, but I'm also the least familiar with the IBM offerings.

I took a look at the MQTT setup and I don't feel that would be a good solution for most members, because it requires a lot of steps and I don't think it's a professional setup.

I do think a plugin is very doable. If I'm the one creating it, it's a matter of the time it takes me to do it and finding that time.

The net result, though, in my opinion: I'm not sure Watson offers anything functionally beyond Dialogflow, other than it sounds ultra posh, and maybe that's all that matters.

I guess I'm wondering, from a functional standpoint, if anyone knows of something cool that the Watson IBM cloud chat service does that we would really want?

#40  

@JustinRatliff Thanks for this thorough walkthrough...I guess it nails it down to the point that all of the cognitive services offer the same thing on a basic level, and the point where they take different routes might be way beyond where we would want to go with our robots! Dialogflow seems to be a good option because it is very well implemented in ARC, plus the Dialogflow workflow is very easy to use! Not wanting to stop anyone from trying to develop for Watson...or any other of those services! :)

Also, we need to be sure to get the point that Dialogflow's strongest feature is breaking down a spoken sentence into the intents that we create! It is not to be thought of as a simple chatbot plugin...

Amazon Web Services is also offering a lot of nice features, btw...;)

PRO
USA
#41  

Sorry, just jumping back in. I managed to have a kidney stone, landed in the ER, and have been down a couple of days.

@Justin, thank you for that great video. I am sticking with your opinion on Watson and suggest we don't fix what ain't broke and go with api.ai. And I will contact you via email after I post this.

@mickey666maus, I agree a large part of spoken language is grabbing the intent of the sentence. Something that my two Amazon Alexas do, and do fairly well. I say "add peanut butter to the grocery list" when in reality she needs to add it to the shopping list; she then replies, "I've added peanut butter to your shopping list." She knew what I meant: the intent.

#42  

I think there will be a great leap in chatbot tech over the next year. All of these services do not really accomplish what I want. They are all just serving back responses that I have to type in when I configure them. I want something that can 'learn'. Not talking full AI, but it would be nice if it could get smarter as I talk to it. I could tell it a story or something about a movie, come back the next day, ask it if it has heard of the movie, and have it reference it. Maybe that is possible in API.AI and I just haven't figured it out yet. It seems like we have speech recognition as well as the ability to parse and understand questions. What we need is the machine learning part.

PRO
Synthiam
#43  

Perry - have you looked at this? https://blog.ubisend.com/discover-chatbots/how-to-make-a-chatbot-that-learns

Many chatbots are "intent" identifiers, not learning AIs. This means, like Will stated, they identify an intent but do not learn. It's up to your code to determine what to do with that intent.

What you're looking for is a natural language chatbot that learns - specifically, isolated to user sessions.

PandoraBot has a simple example of that, by storing specific variables such as names. You can try PandoraBot and say

"My name is DJ"

The bot will respond with

"Nice to meet you, DJ"

Then, you can ask

"What is my name"

And the bot will respond with

"Your name is DJ"

#44  

Thanks for the link DJ. I am currently using Pandorabot because it is the only one I have found that has the learning function.

#45  

Maybe this is something that is worth exploring...it gathers information about yourself so it gets to know who you are, and this information will be used when chatting!:)

https://replika.ai

#46  

Dialogflow has machine learning capability, and you can add scripted functions in Dialogflow for it to accept variables like names.

The Dialogflow YouTube channel has some learning resources if anyone is interested.

PRO
USA
#47  

Hey Justin, great find! Watching all these videos this afternoon. I definitely think that this project needs machine learning. When I review some video I've shot for Alan, Pandorabot is impressive in how it remembers context and adds it into the conversation. The video I shot about my "pet" goes on for some time and feels like a real conversation.

#48  

Remember the Pandorabot we are using is the old version and they have much more extensive capabilities in their new 2.0 service. Unfortunately, it is not a good candidate for a plugin for reasons that were covered in another thread.

When I demo my InMoov, the greatest reaction is when the bot asks someone their name and it starts calling them by it. At that point it is interactive, as opposed to spouting random statements and answering simple questions.

ProgramAB, which MRL uses, seems to be pretty well configured when it comes to this. It is the only real feature of MRL that I miss after moving over to EZ-Robot.

#49  

This goes back to what DJ said: "Many chatbots are "intent" identifiers, not learning AIs. This means, like Will stated, they identify an intent but do not learn. It's up to your code to determine what to do with that intent."

This brings up a good discussion point. So Dialogflow has machine learning, and it can have rule-based learning, but what these services are geared towards most is text-based chat.

That's what these services are designed for; Dialogflow, Watson (for its chatbot service), and the Microsoft Azure chatbot service are designed to make a chat agent that is usually text-based (but they can use speech easily; hopefully that nuance doesn't confuse anyone).

The machine learning can do things like this: if you have an intent for asking about "directions to the hotel" but you misspell it as "derections to ther hotle", having machine learning turned on comes into play to help compensate for that.

Or with "directions hotel", machine learning or rule-based learning may help a fragmented sentence be understood.

And Dialogflow has a pre-built "small talk" agent, which is a generalized chat agent, probably close to a general Pandorabot with a few distinctions. But if you want it to adapt to what you are saying, like in DJ's example, you do have to add some coding and scripting in the agent.

On the plus side, it is extremely easy to change and customize the intents for the agents in Dialogflow. If you watched my video, you saw me mention that the agent's responses did not sound very "Spock"-like for my "JD Spock" robot; it's very easy to adjust them.

Pandorabots is much different in my mind because it uses AIML files to configure the chatbot "brain", which is technology that has been around since 2001, and I would say the same chatbot setup was around for years prior to AIML, at least since 1996. In 1996 I had a shareware chatbot that used text-based files and could keep track of my input the same way, such as my name. It was goofy. For all I know it was the great-grandfather of Pandorabots. I don't say this to dismiss Pandorabots, just to clarify their technology.
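
For anyone who has never looked inside one, an AIML category is just a pattern/template pair. Roughly, the "remember my name" trick DJ described looks something like this (a sketch from memory, not copied from an actual Pandorabots file, so the exact syntax may vary by AIML version):

    <aiml version="2.0">
      <!-- Store whatever follows "my name is" in a per-user predicate -->
      <category>
        <pattern>MY NAME IS *</pattern>
        <template>Nice to meet you, <set name="name"><star/></set>.</template>
      </category>
      <!-- Read the stored predicate back -->
      <category>
        <pattern>WHAT IS MY NAME</pattern>
        <template>Your name is <get name="name"/>.</template>
      </category>
    </aiml>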

The more everyone talks about their robot chat needs, the more I think everyone is saying they are interested in different functional chat qualities, and I don't think any one of the services necessarily fulfills everyone's wants for their robot.

But much like the first person to put peanut butter and jelly together, if you intermix solutions, you might be on to something.

PRO
USA
#50  

Justin, thanks for clearing that up...easy for me to get off the trail. I can tell I struggle with the "one program to rule them all", the complete "push button for real life human robot"...not yet coming to a store near you. I need to focus on the task at hand, the limitations and what needs to be demonstrated:)

#51  

Justin, thanks for the overview. It answered questions and added more. Time and investigation will work it out. My desire is to incorporate the Omron to help drive specific chats for the detected face, or for the age/gender if the face is unknown. I assume agents for each detection will be needed.

Will, have you decided to stay with Watson or Dialogflow? No matter which is picked, it seems development is needed to meet the needs.

I see tough decisions and work ahead, looking for the magic mushroom.

Ron

PRO
USA
#52  

In my opinion, it looks like Dialogflow is the easiest solution. Easy as in David C has already created a framework and a great foundation. Looking through the videos that Justin linked to, Dialogflow is simple and clear to set up. It may not be the magic mushroom, but it will get the job done!

PRO
Canada
#53  

Came across this today: "Make your NAO bot smarter." Looks like a tech-talk webinar type of thing, but maybe worth joining. More info on this page.

It is at 1 pm Eastern on Monday, the 16th of January.

"In this tech talk, we will show you how you can add intelligence to a NAO robot with IBM Data Science Experience and IBM Conversation service."

https://developer.ibm.com/code/events/ibm-code-tech-talk-make-a-nao-robot-smarter/

#54  

I would also be very happy if we kept Dialogflow as our main tool for natural language and intent recognition... It's easy to use and offers good integration with our DIY projects!

I also tend to think that it is not the solution for everything...but it will help to build an intent-driven workflow for our robots. It is not to be seen as a chatbot; the main purpose of Dialogflow is natural language recognition and intent-based response!

#55  

@DJ is there a way you could share the source code of the api.ai plugin?

#58  

@fxrtst Will, owww! Hope you're feeling better (lay off the Diet Coke!)

I've followed many of the ai discussions on the forum. I'm glad we have so many talented folks here; I feel like a kid lost in the forest, and you guys are leading me out!

I've wanted/hoped for a "magic mushroom" solution for years. Every so often, I go and re-visit my chatbot plans; usually scrapping everything and picking up with (what I think is) a better method/technology/application.

My AI/chatbot needs are a lot less sophisticated (I think). All I want is to be able to have a friendly conversation with my robot, and for him to simulate some intelligence. I don't really want him to be cloud-based, although I wouldn't mind if he went online to get some info, check the weather, etc.

I played around with a program called "Ultra Hal", which seemed promising, and then the developer pretty much moved on to other things. Pandorabots has been my second on-again, off-again choice, but setting up a local Pandorabot server is a task I haven't been able to overcome (yet).

A big part of the problem is my many interests; I switch from one to another as inspiration moves me. That's a big detriment to robot/ai building, because when the motivation strikes, everything I've done has evolved and I need to catch up.

I'll be keeping an eye on this thread!

PRO
USA
#59  

Thanks WarPig; not Diet Coke, maybe it's the IV drip of Red Bull?!

As David C pointed out a year ago, these new technologies are being developed so fast by so many different companies, yet they are more or less the same. I think that was, in part, one reason why he abandoned EZ-AI.

We are just at the horizon of a true sci-fi future. It's kinda neat to watch it unfold.

#61  

For anyone looking to add more chat functions to their robot who has no intention of developing a product, and who maybe has watched a couple of Dialogflow videos and thinks aspects of it seem scary to learn and implement...

I say take another look at Pandorabots, because their interface is super simple. In fact, take a look at @Perry_S's member profile and download his cloud app for "RobotMan", because it is a super clean project (nice job, Perry!) and demonstrates how you can use Pandorabots.

@SteveG wrote a very good tutorial which still applies if you use their old website. You can still create accounts and make new bots. The training interface to teach the chatbot is as simple as it gets.

You don't have to make your own Pandorabot if you don't want to; there are lots of ready-made chatbots. But if you want to customize yours, you can make a blank one or start with a preloaded template and customize from there.

I implemented a Pandorabot myself the other day in 30 minutes, and that included going through the tutorial, setting up an account, creating a new bot, and customizing some features. It was a lot of fun. The chat interface is really cute. There is no reason why we can't add this fun feature to our robots.

PRO
Canada
#62  

I was just having a quick play with Google API.AI, and when I install the plugin I get the following error. It still executes, but it's annoying: Error on line 1: Missing String Quotes or Invalid Expression at index 0: jd

Anyone else get this?

PRO
Synthiam
#63  

Post your code please. It’s impossible to diagnose and help fix your code without seeing it.

PRO
Canada
#64  

Edit: MY BAD. Thanks DJ.

I put the Google project ID where it said "script" in the API box. I deleted the name and it worked. I thought that was the name of a Google script it wanted to call.

Actually, no code was created. I just installed your AI plugin on the EZ-Robot desktop and followed the tutorial for forward and reverse. Added my Google token and app name..... Typed "move forward 10 seconds" directly into the AI plugin box, NO SCRIPT, and received the "ok i will move forward for 10 seconds" response and the error "1: Missing String Quotes or Invalid Expression at index 0: jd".

Using the latest version of ARC, the latest version of the plugin, and the latest version of Windows 10.

PRO
Synthiam
#65  

Only EZ-Script or Blockly can be used in the script editor. If you do not have a script, leave the script box blank.

There's a great tutorial on the page for this plugin that explains the script box and how it works. You can find the tutorial here: https://synthiam.com/redirect/legacy?table=plugin&id=193

To find out more about scripts in ARC, here’s a tutorial to get you started: https://synthiam.com/Tutorials/Lesson/23?courseId=1

I do not know of a Google script. I'm unable to find any references to a Google script using the Google search engine. If there's a feature request that you're asking to have added, provide a link to the API and I'll take a look.

Continue learning with the robot program. Have fun!

PRO
Canada
#66  

Hi @Jlucben, did you ever finish your text translator using Watson? Would you be able to upload your new plugin?

Thanks

PRO
Synthiam
#67  

Does Microsoft have a translator? What languages are you looking to do? If Microsoft has one, I can probably whip up a plugin in a few minutes. Their APIs are great.

PRO
Canada
#68  

Hi DJ

I really wanted to use the Watson services. I was hoping someone had gotten one of the services working. I don't know C#/.NET; I guess I need to learn. I downloaded Visual Studio, went into panic mode, and immediately started to hyperventilate.

There is a Watson .NET SDK, but I have no idea how to even begin to make it work with ARC:

https://github.com/watson-developer-cloud/dotnet-standard-sdk

#69  

Hi all

A plugin for language translation is already available, working with the Microsoft Translate Text service, and the API is great: synthiam.com/redirect/legacy?table=plugin&id=208

I have not had enough time in the last few weeks to publish a version working with either the IBM Watson service or Microsoft Translate Text.
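
For anyone who wants to experiment in the meantime, the Watson Language Translator call follows the same basic-auth REST pattern as the other Watson services. This is only a sketch; the v2 endpoint, payload shape and credentials below are my assumptions from that era's API, so check the current API reference:

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    class WatsonTranslatorSketch
    {
        static async Task Main()
        {
            // Placeholder credentials from the Language Translator service on Bluemix.
            const string Username = "YOUR_TRANSLATOR_USERNAME";
            const string Password = "YOUR_TRANSLATOR_PASSWORD";
            // Assumed v2 endpoint of the era; confirm against your service's API reference.
            const string Url = "https://gateway.watsonplatform.net/language-translator/api/v2/translate";

            using (var client = new HttpClient())
            {
                var token = Convert.ToBase64String(Encoding.ASCII.GetBytes(Username + ":" + Password));
                client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

                // Ask for an English-to-French translation; the response JSON holds the translated text.
                var body = new StringContent(
                    "{\"text\": [\"Where is the charging station?\"], \"source\": \"en\", \"target\": \"fr\"}",
                    Encoding.UTF8, "application/json");
                var response = await client.PostAsync(Url, body);
                Console.WriteLine(await response.Content.ReadAsStringAsync());
            }
        }
    }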

PRO
USA
#70  

@Nink,

I'm almost done with my last project (cube machine). I've used two Bluemix services before; I can help with the plugin.

BUT I don't want to step on anyone's toes...

DJ mentioned he was working on/researching the Watson APIs (post #2): https://synthiam.com/Community/Questions/10828

Justin also mentioned the same.

If there is already some work, it does not make sense to start a new path.

PRO
Synthiam
#71  

I did install the Watson library and started on a plugin, but I stopped. I found the service complicated to use and didn't even end up creating an account. Most of the features are duplicates of Microsoft Cognitive.

If anyone wishes to continue, go for it. There are no toes here to step on:D

@PTP, even if I create a plugin, go ahead and create your own if you'd like. Your plugins are historically better than mine anyway!

PRO
Canada
#72  

Hi @PTP, really great news. If you could make a plugin that worked with the Watson services, that would be fantastic. I can't justify buying a NAO or a Pepper, the only two robots that work with Watson (so expensive, and they don't do much for the price).

#73  

I agree with DJ, my findings were the same as his and there are no toes to step on because I'm not working on any plugin for Watson services.

I did create an account to continue to educate myself on their services; however, I have no plans to create a plugin or any applications for the IBM services at this time. As DJ stated, their cognitive services are substantially the same as many of the other services from Microsoft and Dialogflow (api.ai) where we already have plugins.

@Nink, as one of my rules for brain building and artificial intelligence, I would "use the tools you have" before seeking additional new tools. The Watson services are not going to provide you anything net new; they would provide you a similar, duplicate service with the name "Watson" on it and the requirement to manage an IBM account on your part.

PRO
Canada
#74  

Hi @JustinRatliff, I really want to work with Watson, as I have some other tools I have been working with in Bluemix on the back end that I need to integrate with. I know we have Google and MS integration today, and I have played with both of them.

I can't afford to personally pay for someone to develop these plugins but I am happy to donate $500 in EZ-Robot products out of my own pocket to someone who can help build the plugins.

PRO
Synthiam
#75  

What specific services of Watson are you looking for?

PRO
USA
#76  

@DJ, thanks for the kind words. Doing good plugins is a consequence of two things: no timelines, and being both the developer and the client. Although between my day-job clients' stress and my kids asking why it takes so long... the difference is subtle :)

Regarding Watson, I worked on a project where we evaluated a few services. I cannot give any information about that particular project at this time since I've signed an NDA; all I can say is that Watson's maturity is very good.

Microsoft improved a lot and will eventually become stronger, due to their developer and Azure communities.

For non-professional markets, with less complexity, more usability/easier interfaces, and affordable pricing, the best choice is API.AI. Google is growing and adapting their cloud services, so that can be good for the API.AI/Dialogflow community.

Amazon and Apple have their own agendas; unless your target is the final consumer/their clients, it does not make sense to create/adapt your product for them.

As a Microsoft professional I'm very happy with Azure and their Cognitive APIs, although I think Cortana is last compared with Alexa, Google, and Siri.

**EDIT: Not WIT.AI but API.AI; I cut the post text where I mentioned the WIT.AI/FB integration, but it was not relevant.

PRO
USA
#77  

@Nink,

DJ's question:

Quote:

What specific services of Watson are you looking for?

Are you using Node-RED? I presume you want to explore the Conversation API?

PRO
USA
#78  

Interestingly enough, I was contacted by a company that will be demonstrating Watson. It is utilizing the Watson they trained with 1,000 people. They wanted to up the wow factor by utilizing Alena as a visual front end to their Watson. Alena doesn't need to run Watson to be useful to them, as I can pipe any audio through Alena and Alan.

But... I’d be up for PTP working on a plugin, because we all know it would be amazing.

PRO
Canada
#79  

Hi @DJ and @PTP, I am trying to do a couple of things.

Using Node-RED and Node.js. There is an Intu robot control program, and it has a .NET instance that looked interesting: https://rg-gateway.mybluemix.net/ But really I wanted:

  1. Speech to text (EZB => Bluemix)
  2. Process in Node-RED / Node.js / Watson Conversations
  3. Text to speech (Bluemix => EZB)
  4. MQTT (or some other method) to servos (Bluemix => EZB)
  5. Sensors to Bluemix (MQTT)
  6. Object recognition

I want to ask the EZ-B to recognize an object and then interact with the object using voice commands.

PRO
Synthiam
#80  

Have you tried using The Robot Program to teach ARC object recognition? You can add speech recognition to it as well. What are you looking to do? There's already that functionality in ARC, and we provide great episodes of The Robot Program on how to do it. Have you started there?

PRO
Canada
#81  

@DJ Yep, I wanted to use the Watson services, not Microsoft. Thanks.

PRO
Canada
#83  

WOW, very cool, it works. THANK YOU. Curious how to make it loop. I say "watson move forward", it executes the script, and then that is it. So I have to unclick and re-click Listen and STT to make it listen again. The last line of the Watson phrase script is ControlCommand("IBM Watson Services", StartSpeechToText), but the debug window says speech stopped.

Love the detail in the interface with the dB settings and delays. The audio graph is cool, really helpful with background noise.

BTW what does Button 1 do :-)

PRO
USA
#85  

@Nink, I've changed the way the before and after scripts are handled; let me know if your issue is fixed.

"Button 1" always points to some last-minute test code. Removed!

@Mickey666Maus: I would love to believe that, but all the facts point otherwise:)

I've updated the plugin to support Text to Speech. I've added some voice clips; let's hope the translation is correct.

PRO
Canada
#86  

@PTP, fantastic. This is really cool and works really well. I think we should start a new thread for this one now that PTP has made @fxrtst's original request real, as this thread is really busy and mostly off topic.

#87  

What service does the plugin connect to in the Watson services? Is it just Text to Speech? Or is it all of them?

PRO
USA
#88  

@JustinRatliff:

Speech To Text and Text to Speech.

PRO
USA
#89  

Holy cow! As always, job well done! Looking forward to test-driving this. Just finishing the current robots this week! Hats off to you, good sir.

#90  

@PTP: I tried your Watson visual recognition services plugin. Thank you so much for putting in all that effort!

I installed your plugin. I created an IBM Cloud Watson account. And after bumbling around the IBM site for about 3 hours, I finally created my project and API key. I configured the plugin and it IS WORKING FINE!

I also tried the TTS and the STT and got both of those working too.

Thanks again!

Thomas thomas from la (at) g mail (dot) com

cool:D cool

P.S. Do you know if Watson has the ability to locate objects by x and y coordinates?

#91  

I'm soooo impressed and grateful with today's technology at our disposal. Just blows my mind. :-)

The last few days, I finally spent some long overdue time playing with several applications and services such as:

  • Spectral voices for EZB TextToSpeech
  • Microsoft Azure TextToSpeech and TextTranslation
  • PandoraBot
  • IBM Watson Services (SpeechToText, TextToSpeech, and Assistant) with the Plugin.

After many hours of fooling around trying to understand how Watson and the plugin work, I came to a total halt. I cannot get the final details to work; for the SpeechToText and TextToSpeech services, I don't get any voice model to select. In the plugin, it says "When the configuration is called, the plugin logs the existent models and voices.", but I get nothing?!?

When I activate the service (check the Active box in configuration), I get lots of error messages in the Debug window. I attached a capture file; I don't know how to add an image yet.

Similarly, in the Configuration - Assistant tab, I don't get any workspace selection. But this might be because I haven't created one yet in Watson. I will look into it later...

User-inserted image

User-inserted image

#93  

@ptp Got the EZB plugin. Got the URL/username/password. Filled in all the configuration details in the plugin.

My problem is I cannot get any voice model selection. When I click on Model, the only selection available is "No speech models (check credential)". Are those speech models from my Windows speech (voices) or from the IBM Watson service cloud that I have to configure or select?

#95  

Hey @kazobond

I'm sorry that I can't help you, but I've got a question about the plugins that you tried for speech to text and similar.

When you are able to run the Watson one, could you please tell me which one was better than the others?

You would save me a lot of time:)

Cheers !

PRO
USA
#96  

It seems you have an invalid endpoint URL:

User-inserted image

There are some characters and spaces before the "https:"

User-inserted image

#97  

Yooohoooooo! It works!;):):D Thanks @ptp. I knew it had to be something really stupid, but I went over it a zillion times and never caught it. Darn copy-paste (hihihi). Way too much new stuff to absorb. :P Now I need time to experiment. Still so much to learn, but exciting blush

@aryus96, will do for sure. Stay tuned.