
fxmech

I am new to ARC. I purchased the premium subscription but was disappointed that it keeps kicking me out of skills as I'm building because I reach daily limits. How do these limits work? Is there a way to get unlimited time on these skills? The latest issue was with Cognitive Face, but it also happened with Bing voice recognition.
I got around that by opening up an Azure account and setting Advanced Speech Recognition with my own API Key...
Not sure how to handle Cognitive Face though...
In summary, those robot skills connect to Microsoft's paid services and are provided as a courtesy for experimenting.
In the meantime, the robot skill you have selected for this question is "Advanced Speech Recognition". You might have meant "Bing Speech Recognition"; they're similar but different. You will notice in the manuals for these robot skills that "Advanced Speech Recognition" requires your own license key registered with Microsoft. That allows unlimited (well, as many as your wallet can pay for :)) transactions with their server.
If your application requires a dedicated license, let us know and we'll provide you with a robot skill where you can get your own license from Microsoft and have as many transactions as you require.
It appears the cost is $1 USD per hour of speech recognition. If the average speech recognition phrase is 3 seconds long, that's 1,200 recognized phrases for $1 USD.
I was able to go to Azure and get my own API key for speech recognition, but was surprised at how quickly I was getting booted out of Cognitive Face...
Since I am still learning, it feels like I can only use it for a very short period of time before I have to stop because of the limit.
I may be doing something wrong where it checks Cognitive Face every single loop and reaches the maximum limit faster than it should...
This is literally my very first attempt at writing an EZ-Script, so it wouldn't surprise me one bit.
This is how I tried implementing Cognitive Face in my robot:
controlCommand("Cognitive Face", "Detect")
if ($FaceEmotion_Happiness>80)
controlCommand("Auto Position", "AutoPositionFrameJump", "Smile")
elseif ($FaceEmotion_Sadness>80)
controlCommand("Auto Position", "AutoPositionFrameJump", "Frown")
else controlCommand("Auto Position", "AutoPositionFrameJump", "Neutral")
endif
1) Was it in the "Tracking Start" of the camera skill?
2) Or was it in a script of its own looping over and over?
My recommendation is to add it in the camera skill as "Tracking Start". Then, enable Face as a tracking type. That way, every time it sees a face it will execute the script.
I will move it over and try again!
Put this part in the camera's Tracking Start script...
Code:
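# A minimal sketch, assuming the skill is titled "Cognitive Face" in your project:
# ask the Cognitive Face skill to run a detection whenever face tracking starts.
ControlCommand("Cognitive Face", "Detect")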
Then, put this part in the cognitive face config...
Code:
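# A sketch reusing the emotion checks from the original script, assuming the
# Auto Position frames "Smile", "Frown", and "Neutral" exist in your project.
# Placed here, it runs after each completed detection.
if ($FaceEmotion_Happiness > 80)
  ControlCommand("Auto Position", "AutoPositionFrameJump", "Smile")
elseif ($FaceEmotion_Sadness > 80)
  ControlCommand("Auto Position", "AutoPositionFrameJump", "Frown")
else
  ControlCommand("Auto Position", "AutoPositionFrameJump", "Neutral")
endif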
Then, turn on the FACE tracking type. So every time it sees a face, it will instruct the Cognitive Face skill to detect.
I've done a lot of Arduino programming but never moved into this type of environment before.
Another quick question to help me wrap my head around this...
If the camera is running at 15 fps and the start script waits 15 frames before it executes, I think that means Cognitive Face will receive a request every second unless it loses tracking.
Is that a good amount of requests or will that still cause me to overuse the free service and get locked out fairly quickly?
This is just a first test so all I am trying to do is get the robot to mimic my expression... sad, happy, or neutral.
I still can't get over how close I got in one lunch break! I can't wait to learn ARC better!
This type of programming in ARC is event-driven programming. It's how applications on multi-threaded operating systems work, such as Windows, Linux, Android, or iOS. Events are executed in their own threads and can trigger other events, etc.
Regarding the camera TRACKING START question: the Tracking Start script only runs once per detection. It takes 15 frames for the detection to be valid, then the script runs. The script will not run again unless tracking is lost. So it's not a loop for every frame - it runs one time when the tracking has been "identified".
So in your case, the robot will need to lose sight of your face before it will start tracking (and detecting) again.
If you want the robot to detect you in real time, what I would do is modify the camera Tracking Start script to loop every 5 seconds or so, because that script will stop when tracking has stopped. You can also add a camera TRACKING STOP script that speaks something like "Hey, where did you go?"
If you want constant detection of your emotion once the face is detected, put this in the camera TRACKING START script... I'll put an EZ-Script example, but I generally prefer JavaScript or Blockly because it's much faster.
Code:
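# A sketch of a simple 5-second polling loop, assuming standard EZ-Script labels and Goto().
# ARC stops the Tracking Start script when tracking is lost, which ends the loop (see the note above).
:detectLoop
ControlCommand("Cognitive Face", "Detect")
Sleep(5000)
Goto(detectLoop)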
And in the camera TRACKING STOP put something like this...
Code:
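# Example only: speak through the PC speakers when tracking stops.
Say("Hey, where did you go?")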
Or if you want it to talk out the EZB go...
Code:
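# Example only: the same phrase spoken through the EZB's speaker instead of the PC.
SayEZB("Hey, where did you go?")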
Actually, I appreciate you overcoming your pain to help with this issue. I am learning also. I'll be installing a camera in my robot soon, and this is very helpful. Thanks.
Issue is now resolved