Asked
Resolved by DJ Sures!
I am slowly getting all the quirks with the T265 worked out by docking etc. We really need some distance sensing as well so we can get better detail of our surroundings. @DJ are there any plans in adding a distance sensor like a D435 (or D435i since we already have an IMU in the T265).
Just trying to work out what I should purchase in anticipation of some future depth sensing capabilities.
I'm struggling with the D435 because of the poor reviews and discontinued status. The T265 was a pretty big risk to add support for because of the "Intel way of abandoning customers".
So far the best bet is to agree on a lidar that you would all like to see supported. The Kinect was a disappointment - and I had high hopes due to all of the blogs of people demonstrating great successes... until I realized we're dealing with a Boston Dynamics-type scenario, where there's one usable result out of a thousand takes.
Really, I'm not sure what direction we should be heading. If we invest time into supporting a sensor that doesn't work, you'll be knocking on my door asking "why not" and my only response is "hey, I didn't make it" LOL. Any feedback on what direction you'd like to go is encouraged to help us - I am leaning toward lidar.
But remember, any distance scanning device pushes data into the NMS so it's not really a lot of work. You can use the Javascript NMS methods and push data into the NMS from any device yourself as well.
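As a sketch of what pushing your own distance data looks like: the script's main job is assembling whatever the sensor reports into a per-degree distance array before handing it to the NMS. This is only an illustration in Python - the helper name `build_scan` and the zero "unknown" sentinel are my own, and the actual ARC push call and its units are per the Navigation updateScan page in the Synthiam manual, not shown here:

```python
def build_scan(readings, unknown=0):
    """Assemble sparse (angle_deg, distance) readings into a 360-entry array.

    Angles the sensor never reported keep the 'unknown' sentinel; in ARC you
    would then hand the array to the NMS (see the Navigation updateScan page
    in the Synthiam manual for the real call and expected units).
    """
    scan = [unknown] * 360
    for angle_deg, distance in readings:
        scan[int(round(angle_deg)) % 360] = distance
    return scan

# Three ping-style readings become a full 360-degree scan array
scan = build_scan([(0, 50), (90.4, 120), (355, 30)])
```

The same shape of data works whether it comes from a full lidar revolution or a handful of ultrasonic pings swept on a servo.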
We have heard rumours that Boston Dynamics is using the RealSense D4XX on SPOT. We have also seen it in Unitree, Dogotix and Ghost Robotics. If these robot dogs do take off, I think Intel may have a steady customer base moving forward and a reason to keep the product in their supply chain. Lidar is great for a 360 view, but if you are trying to get a robot to climb up or down a flight of stairs, I assume you need a direct line of sight to the area you are heading into.
What use cases do you see for the Boston Dynamics robots?
In a conversation I had yesterday in Discord, we now believe that every BD SPOT has 5 Intel D430 modules. The D430 is the same module as in the D435, but with a custom RGB sensor added. Today SPOT costs $75K, but Hyundai purchased them for mass production. I would estimate actual manufacturing and parts costs would be < $5K, and they will probably retail, once commoditized, for under $10K.
I believe BD use cases fall into 3 categories: The Good, The Bad and The Ugly.
The Good: Unlike a wheeled robot or a drone, a legged robot can go into areas that are accessible only by walking. Although our world has been transformed to support accessibility standards, as soon as the power goes out or an alarm goes off, the lifts stop working. If there is debris in the way or hazards to navigate, our wheeled robot friends will no longer function. If you have ever flown a drone indoors, you know it requires a significant amount of skill to pilot a large drone and inevitably results in a collision in confined spaces. Small nimble drones do not have the battery to last longer than ~20 minutes and can't interact with their surroundings (open doors or windows, move objects etc.), and this is why humans still perform tasks in these environments today.
The ability of a legged robot to function in these spaces makes them ideal candidates for any place that is not safe for humans. First responders (firemen looking for survivors in a burning building, police responding to the scene of a shooting, a bomb squad inspecting a package) are good first examples of where SPOT is currently being deployed. Hazardous environments such as chemical leaks, nuclear power plants, machine plants, oil rigs, mining etc. all provide use cases for SPOT to perform a task where a human should not.
The Bad: At under $10K, SPOT, like any robot, will replace jobs. The security guard who patrols a building, the field tech who monitors and inspects equipment in the field, the auditor who verifies assets etc. There is a whole range of tasks humans perform just by walking around, observing and interacting with their surroundings, recording data and making changes as required. Initially these roles will be performed in a semi-autonomous fashion, with human operators remotely monitoring multiple robots simultaneously (think Exosphere), eventually giving way to fully autonomous robots that, through repetition and reinforcement learning, will be able to perform these tasks without assistance.
One could argue that as these human-operated jobs go away, people will find other high-value tasks to perform that are more suitable to a human's skill set. I recall the initial concerns that computers were going to take our jobs away, but we all now have jobs designing, programming, operating and repairing computers.
The Ugly: I don't want to go here, but I think we need to be realistic and understand that people will always use technology for nefarious purposes. BD's original R&D was funded by DARPA, and I don't think their interests were exactly humanitarian in nature. Dogs of war are coming, and those trucks full of soldiers that fight wars in cities today will soon be replaced with trucks full of robot dogs with guns strapped to their backs. The US economy is fuelled by war, and that is not going to end soon. This, sadly, will be one of the largest markets for robot dogs. Our children, who spend every waking moment playing video games where they remotely control avatars to kill their opponents, have become experts at remote controlling killing machines. They have also become completely desensitized when it comes to indiscriminately killing an opponent. Our future wars will be fought, won and lost by an army of highly skilled teenagers remote controlling war dogs, inadvertently killing innocent people and children with little to no remorse. Whatever happened to the 3 laws of robotics?
@DJ said
Did you look at the two I posted from RobotShop? (There is actually a third, slightly more expensive but more self-contained one now, with the same SDK as one of the others: a YDLidar at $159 instead of sub-$100.) I would say if you look at the SDKs of both and one would be easier to write the skill for than the other, we would all be happy to settle for it (the next least expensive is over $300; I don't think many of us are interested in spending that much on a single sensor). It seems that YDLidar is the more active vendor, with several devices in multiple price ranges, and I believe they may all use the same SDK, so if it is usable, that might be the best option. Here are the links again:
https://www.robotshop.com/en/rplidar-a1m8-360-degree-laser-scanner-development-kit.html
https://www.robotshop.com/en/ydlidar-x4-360-laser-scanner.html
and the new one: https://www.robotshop.com/en/ydlidar-g2-360-laser-scanner.html
Alan
If I can help in any way I will be happy to. Well, that's my 2 cents... Have fun
Manual in support section: https://synthiam.com/Support/javascript-api/Navigation/updateScan
Thanks for adding that to the manual.
I think the Navigator and T265 have been the best steps forward for my interests in robotics. They really add so much to ARC. They are awesome!! Lidar with basic obstacle avoidance, would be another great upgrade to ARC. Thank you, Steve S
That’s been in the manual since the NMS was released and has a nice write up in the NMS page. The good thing about that support section is the great structure along the left side of topics. I find it super helpful and mostly keep it open for reference in a browser on my desktop.
Also makes for a good evening read! I browse it often to stir up new ideas as well.
Oh, I apologise. I go to the support section often and it is very nicely organized.
Nice - thanks. We’ve been working hard at it.
I just ordered a D435i. My gut is still a bit wary about it and I can't quite explain why. Guess we'll see! And my fingers are crossed. Newegg had the best price ($30 less than Amazon at $309 CAD including shipping).
Wow, what a coincidence - I just ordered a D435i on Newegg as well. My robot dog's knees will be forever in your debt.
Newegg: Intel D435i RealSense Depth Camera - OUT OF STOCK - $235.99
Amazon: $312.59 + $3.99 delivery (March 24 - April 8). Ships from and sold by RAREWAVES-IMPORTS (156,169 ratings, 85% positive over the last 12 months)
Amazon: Intel RealSense Depth Camera D435i, Silver (82635D435IDK5P), 4.4 out of 5 stars (13 ratings, 7 answered questions), Amazon's Choice - $210.13, overnight delivery and free returns
Intel:
https://store.intelrealsense.com/buy-intel-realsense-depth-camera-d435i.html?_ga=2.244802892.814215966.1615303606-1125100293.1615303606
$199 (US) (might charge shipping though)
Alan
Anything we buy in Canada from the US involves shipping, tax, import duties and a customs handling fee from DHL or UPS. I ordered a $50 battery last week and the overhead cost me more than the battery itself. So much for USMCA.
I’ll never order anything from the USA again lol. I recently ordered $100 usd floor mats for my car. I paid $70 import brokerage. It’s outrageous!
I use Digi-Key when I need to source electronics from multiple vendors, e.g. Adafruit, SparkFun, Intel, DFRobot, Seeed Studio, and sometimes Mouser and Farnell (aka Newark in US/CA). D435i, $286 CAD: https://www.digikey.ca/en/products/detail/intel-realsense/82635D435IDK5P/9926004
Do you have to pay the same for shipments from China?
I will be honest I purchased mine second hand on eBay. :-)
China is hit or miss. Yesterday my Pi X turned up and I didn’t get charged anything but Jeramie did get charged.
@DJ What about the Lidar?
@ptp Do you have the D435i?
What about the lidar? LOL your question is lacking context haha
You asked us about what lidar we would like to see supported. Is there a winner? Lol
DJ said:
My post in this thread from several days ago was responding to your question from last week. There were several posts following this with opinions. I would still be interested in the Lidar capability also. Steve S
Alan
I totally agree with you Alan, sounds like we have the same goals for our robotics. Steve S
Lidars: Neato XV-11 (source: eBay). I have one. It requires: A) an extra micro-controller, e.g. a Teensy, and an h-bridge, OR B) a Surreal controller. I have both options A and B. I use it with ROS and it's a cheap alternative.
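For anyone taking option A, the XV-11 streams fixed 22-byte packets over serial, and the micro-controller's job is mostly framing and decoding them. The sketch below follows the commonly documented community reverse-engineering of the protocol (start byte 0xFA, packet index, motor RPM, then four 4-byte readings); treat the field layout as an assumption to verify against your own unit, and note the trailing checksum is not validated here:

```python
def parse_xv11_packet(pkt):
    """Decode one 22-byte Neato XV-11 lidar packet.

    Returns (start_angle_deg, rpm, readings), where readings is a list of four
    distances in mm (None where the sensor flagged the sample invalid).
    Layout per community protocol notes; the final two checksum bytes are
    ignored in this sketch.
    """
    if len(pkt) != 22 or pkt[0] != 0xFA:
        raise ValueError("not a valid XV-11 packet")
    index = pkt[1] - 0xA0                      # packet 0..89, 4 degrees each
    rpm = (pkt[2] | (pkt[3] << 8)) / 64.0      # motor speed, little-endian
    readings = []
    for i in range(4):
        b = pkt[4 + 4 * i : 8 + 4 * i]
        invalid = bool(b[1] & 0x80)            # bit 7 of byte 1 = invalid flag
        distance_mm = b[0] | ((b[1] & 0x3F) << 8)
        readings.append(None if invalid else distance_mm)
    return index * 4, rpm, readings
```

Ninety such packets make one full revolution, which is the per-degree array a navigation plugin would then forward.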
Hitachi lidar (Robotis): I have one and it works well in ROS and with DJ's plugin. I got one for $135 from Robotis; unfortunately the price is up, and there are other cheap options.
I have played with a few lidars, and if you are a lab rat or doing serious robotics, SICK and Hokuyo lidars are the top choice. The cheapest of those is: https://www.robotshop.com/en/hokuyo-urg-04lx-ug01-scanning-laser-rangefinder.html ($1000)
So for 10% or 20% of that price ($100-$200), don't expect the same results. In that interval we have a lot of options and clones. So far I don't feel motivated to invest in other options because the XV-11 and the Hitachi lidar work.
I know where the $100-$200 lidar limitations are, so maybe later down the road I will try to save money to get a Hokuyo or SICK lidar that works outside, but I have other priorities.
Returning to the amateur's reality...
I found this one: https://www.ebay.com/itm/360-degree-LiDAR-sensor-ranging-distance-scanning-module-ROS/313448191749
It is the Hitachi lidar that is available on eBay and AliExpress ($38-$50). I'll order one to test.
@ptp I tried last month to buy the Hitachi Lidar, and every vendor on eBay turned out to be the same Chinese vendor and was actually out of stock. Also out of stock from Robotis. Your eBay link above shows that the vendor is in NY. I will give it a try since we already have a working skill, but I really think we need a plugin for one that is more commonly available.
Oh...... this is not the same one from Robotis is it.... Oh well, ordered, so we'll see if it is compatible.
Alan
@Proteusy:
No. I have a stack of 3D sensors; from Intel I have the R200 and ZR300, plus Kinects, Asus etc. All of them have a common goal: to provide a point cloud, from which you can do "3D navigation" or convert to a 2D laser scan. Unfortunately Intel support is what it is... (I've explained in another post.)
Most professionals and researchers are working with Unix and ROS; there are few ROS packages, and the common requirement is to obtain a point cloud, i.e. a bunch of 3D points.
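To make the point-cloud-to-laser-scan idea concrete: take one horizontal row of the depth image at the sensor's height and project each column through a pinhole camera model, which yields the (angle, range) pairs a 2D laser scan is made of. A hedged sketch - the function name and the simplified pinhole assumptions (fx and cx intrinsics, depth already in metres) are mine:

```python
import math

def depth_row_to_scan(depth_row_m, fx, cx):
    """Convert one row of a depth image (metres) into (angle_rad, range_m) pairs.

    Each column u is treated as a ray through a pinhole camera with focal
    length fx and principal point cx; zero depth means no return and is skipped.
    """
    scan = []
    for u, z in enumerate(depth_row_m):
        if z <= 0:
            continue
        x = (u - cx) * z / fx          # lateral offset of the 3D point
        scan.append((math.atan2(x, z), math.hypot(x, z)))
    return scan
```

Real drivers do this across many rows and keep the minimum range per angle, but the per-column projection above is the core of it.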
So higher-level features are always left behind, plus there are problems and issues with consumer operating systems, i.e. Windows. But, once again, it is easy to understand: if you are building a professional robot, you don't use a Windows desktop on the robot.
If you are a Windows user or an amateur robotics user, sooner or later you will hit an issue and you will experience their support.
To make it short, I plan to get one RealSense camera, maybe a D435i or D455; in the latter model the RGB and IR images have the same resolution, which allows an easy mapping between RGB and depth data.
I'll wait to see what DJ plans to do.
If not the guy said it works with ROS, so I'll port the code.
Cool. Just got a shipping notice already, so at least they are really in stock with him.
@proteusy: I have a friend in Portugal looking for some hardware and local tips and I would like to introduce him to you, can you please drop me an email ? (My email is in my profile)
Thank God I'm a hobby robot builder, at least at home. The RPLidar is perfect for what we need. At work I play with SICK laser scanners, not 360 but 180 degree ones, and our diagnostic software runs in Windows. PTP, could you share what the goal for your robot is?
@ptp You have mail.
Any lidar can be added to ARC via the NMS. It can be done either by creating a plugin and passing per-degree distance data to the NMS, or by writing it in JavaScript or Python and using the NMS navigation command to send distance data: https://synthiam.com/Support/javascript-api/Navigation/updateScan
One can't expect a $100 lidar to perform like a $319 or $519 one.
Plus there are other issues related to update rate and the precision required to generate good maps. On top of that there are factors like robot speed: if your bot is slow like a vacuum cleaner, a slow update rate and cheap precision are not a problem, but they can be if you try to move at higher speeds.
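The speed-versus-scan-rate point can be put into numbers: the robot travels speed/rate metres between full revolutions, so the same lidar that is fine on a slow vacuum leaves large blind gaps on a faster platform. A toy calculation (the example speeds and the 5 Hz rate are illustrative, not any particular unit's spec):

```python
def travel_per_scan(speed_m_s, scan_hz):
    """Metres the robot moves between consecutive full lidar revolutions."""
    return speed_m_s / scan_hz

# Vacuum-style robot vs. a faster platform, both with a 5 Hz lidar
slow = travel_per_scan(0.3, 5)   # about 0.06 m between scans
fast = travel_per_scan(2.0, 5)   # about 0.4 m between scans
```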
PTP, I believe you meant "academic" robots instead of "professional". The robot industry is far too small for the word "professional". The only industry to demonstrate market penetration of robot products is education, for academic use. And even education robot revenue is insignificant compared to most anything else, such as Chinese restaurant plum sauce packet sales. There has never been and will never be a scalable industry built on a foundation of complexities that limits entry and scalability.
There aren't enough programmers in the world to sustain and scale businesses requiring such a high level of experience/knowledge for hiring.
All consumer-facing products, software, support, and UI migrate to Windows for the reason stated above. This also applies to support staff scaling requirements for businesses.
The only place where the engineering ego receives attention is in academics. Businesses focus on the highest percentage of TAM revenue while maintaining the lowest operating cost to support new product development, growth, and market fluctuations. This is why all of the robot companies keep going out of business because they're founded and operated by engineers.
*Note: What I'm getting at is very few people want to know "how it does it", yet everyone wants to know "what it does". The choice of OS or complexities that the product is built with is irrelevant. And we should reserve the word professional for when robots actually do something consumers find useful.
Professional is very subjective... You are a good example of an exception to a market pattern: you created EZ-Robot and later Synthiam to deliver software to learn and run robots without ROS or Linux. I believe there are many other examples, e.g. Lego Mindstorms (which started from talks about the MIT Logo Turtle language, as you mentioned last hack night), and there are a lot of companies delivering solutions based on MIT Scratch or Google Blockly, showing that it is possible to create and run robots without an MSc or PhD in robotics, electronics or mechanics.
But none of us are professionals; we are consumers of Synthiam and EZ-Robot products. We are not professionally creating products; we are enjoying flexible, friendly tools on a friendly operating system, Windows (used on desktops and personal computers), and obtaining nice results without understanding all the tidbits and mumbo jumbo required to build and operate robots.
I feel like we just hijacked the thread.
I do feel like we are all contributing to professional robotics - in the sense that someone will make a robot product that matures the robotics industry. Moreover, successful robot companies will do so without needing to own the full stack. Synthiam does a lot to support startup robot companies, ranging from R&D space exploration to surgical robots. Would those run Windows in a production environment? You would be surprised to know the da Vinci surgical robot runs Windows. The robot arm that flips fast-food burgers runs Windows too.
The reason comes down to the cost of maintenance and support - if McDonald's were to employ robots running Linux and ROS, can you imagine the type of staff they would need to hire to maintain the robots at each location? A McDonald's at a truck stop in the middle of nowhere suddenly needs an on-call Ph.D.
So my philosophy is based on reviewing the evolution and maturity of industries our society depends on today. The most comfortable industry to overlay on the robotics industry is Computers. Computers started as complicated things that hardcore geeks continued to make challenging to save ego. Nevertheless, suddenly these companies made computers available to everyone, and BOOM!
However, what happened to those nerdy geeks who built their own computers and used Unix/Linux? Today they work at data centers maintaining VMs - sure, it is a pretty big industry and is responsible for supporting websites. However, while Linux is the infrastructure for computing, its revenue footprint is insignificant compared to the size of the industries that depend on it.
Where is Synthiam heading? We do not expect ARC and Synthiam's current UI to be a forever thing. It will change, but the world is not ready for it to change yet. The fact is, no one has used ARC to its fullest potential - there is much headroom left in ARC. We need to see some product ideas (proofs of concept). I was hoping the challenges that Covid introduced would shake a few things up and result in some cool robots...
Robomodix was all about trying to make an emotional connection (HRI) with a robot that provides information and communicates like a person. That way the robot performs (information, scheduling, chatting), in return the user feels satisfied, and instead of wanting it to dock it or have it sleep, they want to continue to use it. An emotional connection has been made.
Edit: and yes, we have hijacked this thread...
Alan
My Intel Realsense D435i arrived today. However, it's damaged and doesn't work. It'll be another week before I receive the Newegg replacement.
I got a brand new Hitachi lidar from eBay and it works! More details here: https://synthiam.com/Support/Skills/Navigation/Hitachi-LG-LDS-Lidar?id=20086
So everyone can start avoiding the obstacles yay!
PS: I only tested the Hitachi lidar plugin, not the other plugins (the NMS and The Navigator). I presume everything works together, and it's no fun without a T265.
Newegg refunded me because they no longer carry the d435i.
I guess we continue the hunt for depth sensing. At least we have a LIDAR now.
We have Lidar but no mapping and path planning. So what is the Lidar for?
Here’s the link to the robot skill tutorial, it’s super useful for extending or building robot skills: https://synthiam.com/Support/Create-Robot-Skill/Overview
Any of the existing skills can be tweaked or extended. If you have additional requirements, give it a walkthrough
Also, I should add that there’s JavaScript and python scripting in ARC to extend existing robot skills. The NMS specifically has great JavaScript and python commands.
Something useful to use is Synthiam’s Skill Store to discover robot skills. Click Products -> Skill Store from the top menu. There’s a path planning robot skill that could be used here: https://synthiam.com/Support/Skills/Navigation/Wavefront?id=15874
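For a feel of what a wavefront planner does under the hood (this is the textbook algorithm, not the linked skill's actual code): it floods integer costs outward from the goal across an occupancy grid with a breadth-first search, then walks downhill from the robot's cell. A minimal sketch:

```python
from collections import deque

def wavefront_path(grid, start, goal):
    """Plan a path on a 2D occupancy grid (0 = free, 1 = obstacle).

    Floods costs outward from the goal (BFS), then descends the cost
    gradient from start to goal. Returns a list of (row, col) cells, or
    None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    cost = [[None] * cols for _ in range(rows)]
    cost[goal[0]][goal[1]] = 0
    q = deque([goal])
    while q:                                   # breadth-first flood from goal
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and cost[nr][nc] is None:
                cost[nr][nc] = cost[r][c] + 1
                q.append((nr, nc))
    if cost[start[0]][start[1]] is None:
        return None                            # no free path exists
    path, (r, c) = [start], start
    while (r, c) != goal:                      # step to the cheapest neighbour
        r, c = min(((r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= r + dr < rows and 0 <= c + dc < cols
                    and cost[r + dr][c + dc] is not None),
                   key=lambda p: cost[p[0]][p[1]])
        path.append((r, c))
    return path
```

Because the flood assigns each free cell its true grid distance to the goal, the descent always terminates and yields a shortest 4-connected path.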
Hey DJ, don't give up on ARC just yet. In many cases like mine, the whole last summer was ruined by lockdowns and wannabe cops that were useless security guards. I am slowly relearning scripting again for JavaScript. One step at a time, but getting there.
@DJ:
If you are still looking for the d435i: https://www.digikey.ca/en/products/detail/intel-realsense/82635D435IDK5P/9926004
@PTP that's a pretty good price. I assume it is in CAD because it had Canada selected.
I'm having difficulty wrapping my head around creating a robot skill for this product. It's practically impossible to buy, it's discontinued, and the API is really buggy and no longer maintained. How will people experience the robot skill if no one can get the hardware?
Intel makes things so difficult.
Anyone else want to make the robot skill? I think it would be quite easy by modifying the Kinect navigation robot skill. They almost do the same thing.
I do think at some stage an official 3D depth sensor needs to be supported, as a robot needs to be able to judge distance on more than one plane. This is going to be crucial for any robot navigating complex real-world environments. The last thing you want is your Uber delivery robot knocking down an old lady because it failed to see her walking. There are a bunch on the market, but I guess someone has to take a leap of faith and select one to support. Which sensor? I said the D435, as Intel seemed to have some history in this space and I would hope at least a product roadmap. https://rosindustrial.org/3d-camera-survey
The NMS can support any of the depth sensors you want to throw at it. Simply follow the tutorial to create a skill, and you can build off existing skill source: https://synthiam.com/Support/Create-Robot-Skill/Overview
I'm not sure what the ROS link is for?
I thought the entire concept of ARC was that you didn't need to know how to code, but now you need to code to create a skill. Did I miss something?
My thoughts exactly, Nink. The ARC concept. I can do some simple coding in Python, but modifying a skill or creating one... nope.
Check out synthiam.com to find out what Synthiam is all about. Also, the About section is super useful. I personally make robot skills to contribute to the platform for people to use, such as your WebSockets skill that I thought would be useful to you. Synthiam does the development of integrations for customers, such as the recent Tello robot skill, etc. You can certainly hire us to create a skill that helps you achieve your goals. But if you're just using ARC to play around, it may not make sense to invest that much into it; keep experiencing robot skills as they pop up. Generally, if you see a robot skill appear, it's because someone paid us to make it.
If you can code a bit in Python, then you can also use the NMS to connect to any sensor or service as well. The NMS has python commands exposed
Unfortunately this is a hobby so I don't have the funds to pay for custom code. If anyone wants (and has the skills) to create a plugin for ARC for a D435i I am happy to buy one and send it to you. Thanks
You know I’ll end up making one lol. I ordered one but confidence is low for it. But we’ll see!
I think my priority for anything navigation related is some path planning embedded in The Navigator. I hope to have time to do that in the next month or two. If I can get a small break from customer support. That eats up most of my time in a day lately.
I don’t really get a chance to play with robots ever. It’s kinda sad really LOL. I started this cause I wanted to build robots but now I just write manuals and answer support questions LOL
@Nink, DJ has been trying to convince others to code plugins since he added the plugin concept to ARC. Unfortunately, except for @PTP, most of the people with developer skills who were doing things with the SDK had moved on to other places, for one reason or another, before the plugin concept was created.
I personally have been trying to find time to update my skills so that I can write my own plugins to share. I used to be a decent Visual Basic 6 coder, but .NET coding is very different, and I just haven't been able to spend the time to wrap my head around it. I also want to learn JavaScript since it seems that is the direction forward over EZ-Script. Unfortunately, I just don't have the time to spend on that right now. I thought not commuting because of Covid would buy me some time, but I found it quickly filling up with working on deferred maintenance of my almost 50-year-old house, and learning guitar... But I am finally getting a little bit back into my robots. The NMS has been a big inspiration to get me back onto robot building. It really is the key functionality I have been waiting for that changes mobile robots from essentially advanced RC cars to real robots.
I should point out that I suggested the plugin idea (I used slightly different words) in November of 2011. Had @DJ created it then rather than many years later, I probably would have had the time to sharpen up my skills and would have been contributing https://synthiam.com/Community/Questions/An-SDK-idea-7023
Although, @DJ..... It doesn't help to get others to write them when someone does say they are starting one, or figures out another way to do an integration, and you turn around and publish one overnight. I have seen this happen a few times and I think it may sometimes discourage others from innovating. On the other hand, yours usually work, and some of the other ones out there that were developed don't and their developers have disappeared.
Alan
Hi @Alan I appreciate the input and I always appreciate the help you give me on the forum when I can't figure something out in ARC. I haven't written code in 30 years and I think I am a little past the stage to learn (can't focus and bad memory). I agree on the NMS front. It works great for a Roomba or small wheeled robot with a bunch of ping sensors and it knows where it is going (for the most part) and doesn't smash into things (as much). I guess we were suddenly given something that made our robots somewhat autonomous and now we all want more. It is a bit like giving a kid who has never had candy before his first taste. Now he wants more and to move onto chocolate bars and other confections. We are hooked.
The thing about the NMS is that it's only the messaging system for navigation; it's only as good as the input and output. Interestingly, the NMS has actually been in ARC since around 2014, but it was only called the Messaging Service until it was renamed and used with the RealSense.
The reason the NMS didn't get any spotlight was because there haven't been a lot of advancements in navigation sensors etc. I was hoping the RealSense would ignite competition in that space, but sadly no.
The main trouble behind stagnant growth in robot sensors is cost and application volume. You see, companies that invent a lidar or a single sensor are so proud of it that they feel it's the heart of the robot revolution, and they charge an arm and a leg.
So when you add up all of these inflated-ego product prices for each sensor needed to assemble a robot, you end up with a cost that's unbearable.
Take RealSense for example. Not only is the price unrealistic as a BOM item in a product, but it's unattainable to purchase. Intel discontinued the RealSense after the first manufacturing run. It's like they made 10,000 units and moved on to the next thing.
Other navigation stuff, like industrial lidar, makes me laugh. They price these units at about the cost of a reliable used car. Meanwhile, selling one is a victory for them LOL. Sure, the single application gets some media attention and The Robot Report writes that the robot revolution is around the corner hahaha. That seems to happen every few months. Usually when a floor-cleaning robot comes out, but now it's robot arms.
Anyway - I should add that it's not customer support on the forum that takes a lot of my time. The forum is just a few old schoolers like yourselves that I enjoy talking to. The customer support I spend a lot of time on is the volume-license enterprises, like schools, startup accelerators and a few R&D labs. But they are the reason behind many of the usability and stability updates that you're experiencing in ARC. Also, the support section continually grows based on their interaction.
It's a blessing in disguise, because the manual section always needed some work.
Hey DJ, when you have time you could do a hack night with us about improving the NMS and plugin creation. We could try to do something together. Just a suggestion. I have my beer ready. :)
Beer and coding always results in something fun
The trouble with coding on a live hack is focus and the time constraint. It takes many, many hours to write simple functionality. That's why I avoid it for complicated things. Like, it would take a few days to code some path planning into The Navigator, or a week to code some map editing and saving features, etc.
I have to clear days of uninterruption to tackle development like that
also, since everyone is working from home for the last 13 months, we haven’t even been to the office in forever. I don’t even know why I pay tens of thousands per month for an office. It’s expensive and not even being used. So live hacks take the back burner LOL. I’ll probably shut the office down when our lease expires and move my lab back into my house. Probably do a lot more live hacks that way!
Wow, tens of thousands! I do get that you do your best work when concentrating uninterrupted. When I had my huge motorhome out in nature, I got right into learning code for the old EZ-Scripts: nobody bothering me, knocking on my door all the time, and so peaceful. The good old days. @Alan, hey, I was trying to learn guitar the last 10 years but I still suck LOL! I have an autographed Paul Stanley guitar of Kiss fame. I mainly just do power chords and use my distortion box to make crazy death metal sounds. Or AC/DC!
@DJ I wondered why you kept that big office after you sold EZ-Robot. Seems like an expensive way to store pinball machines. Most software companies I know run their business off a laptop and move into shared office spaces where you can meet customers and book a boardroom etc. when you need them. Although you may be better off in one of the innovation hubs where startups hang out and collaborate and you can partner with other tech companies. Good for visibility, angel investors, VCs etc. (but I assume you probably know everyone in Calgary anyway).
I love that office, that’s why. Designed it from a blank slate. If you had a chance to visit, you’d know what I meant that it’s a reflection of myself
But it doesn’t make sense if it’s empty / even from me. Especially when you consider how absolutely massive it is LOL!
I would always picture your office kind of like Willy Wonka's chocolate factory, except wandering robots instead of Oompa Loompas and chocolate rivers LOL!
It is a combination of both lol
@DJ I still remember when you moved to that office and had 3 live webcams and we could follow the remodeling process and later all you guys at work... Love that space!
Sad that we had to take the webcams offline. There was a staff member who had issues with being broadcast - even though the image resolution was 160x120 and everyone looked like a blurry pixel HAHA
I have the footage from the very first webcam all the way until today. They're stored on a server raid array. When the drive fills, we replace it with another and put the old drive in a storage box. I think each drive is around 4tb and there's about 4 or 5 of them. So around 20tb or so of video since 2013. I think it was 2013 when we moved into the first office.
LoL! I did feel like I was intruding when I could see people playing on the arcade machines or eating food, so I would try never to look, just go straight to the questions. xD
Thank you so much for doing this. My robot is in pieces right now but as soon as I get the parts I will put it back together and try this out. VERY COOL THANK YOU