
Charel

I am preparing to refactor my EZ-Script code to Python, and it's an excellent opportunity for me to review and improve its structure.
One of the features I've developed is a logging utility called Log-a-Lot, which assists in debugging, statistics collection, and more.
In EZ-Script, I used the following code snippet:
FileWriteLine($logfilename, $time + ",info,P1," + "Open")
Servo(d4, 1)
WaitForServoMove(d4, 400)
Release(d4)
This code creates a simple CSV file with columns containing specific information.
Previously, I encountered an issue where the log file could become locked or be in use. However, I can now address this in Python using try-except blocks, making my logging process more robust. I'm considering adding a common logging function that handles buffering and avoids locking errors, potentially using libraries like Python's portalocker.
Before proceeding with the refactoring, I want to ensure I'm not overlooking any built-in features. For instance, the print function in Synthiam ARC automatically adds a timestamp in the debug console. Is there an existing function that collects all debug console output into a time-ordered log file? This way, I could simply print all the necessary information, and it would be automatically recorded in a log file.
Below is some guidance on how to approach this refactor as you transition from EZ-Script to Python in ARC.
Does ARC Already Log Debug Output to a File?
At this time, while ARC’s built-in print (or its equivalent for your language) does timestamp output in the debug console, there isn’t a built-in ARC function that automatically collects all debug console output into a persistent, time-ordered log file. In other words, ARC does not automatically save all of your console prints as a log file for you.
Creating a Custom Logging Function in Python
Since you’re already considering adding a common logging function that handles buffering and deals with file locking (using, for example, Python’s portalocker library), you’re on the right track. Python’s standard library even provides a robust logging module that you can customize. Here’s an example of what such a function might look like:
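A minimal sketch using Python's standard logging module (the logger name "robot" and the file name robot_log.txt are illustrative, not ARC conventions):

```python
import logging

def get_logger(logfile="robot_log.txt"):
    """Return a logger that writes timestamped lines to both the
    debug console and a persistent log file."""
    logger = logging.getLogger("robot")
    if not logger.handlers:  # avoid adding duplicate handlers on repeated calls
        logger.setLevel(logging.DEBUG)
        fmt = logging.Formatter("%(asctime)s,%(levelname)s,%(message)s")
        file_handler = logging.FileHandler(logfile)
        file_handler.setFormatter(fmt)
        console_handler = logging.StreamHandler()
        console_handler.setFormatter(fmt)
        logger.addHandler(file_handler)
        logger.addHandler(console_handler)
    return logger

logger = get_logger()
logger.info("P1,Open")  # mirrors the EZ-Script CSV line: time,level,P1,Open
```

Because the formatter prepends `%(asctime)s` and a comma, each record lands in the file as a CSV-style row, similar to the FileWriteLine output above.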
A Few Points to Consider
Using Python’s Logging Module:
The above example uses Python’s built-in logging module to add timestamps (both to the file and the console). Even though ARC’s debug console automatically adds a timestamp when printing, it doesn’t persist that information in a file.
Handling File Locking:
You mentioned that file locking was an issue in EZ-Script. The Portalocker library is useful in Python to avoid conflicts if multiple processes (or parallel script invocations) attempt to write to the log file at once.
Central Logging Function:
Having a centralized logging function (like the one above) helps ensure that all parts of your refactored Python code write to a common log file with consistent formatting and error handling.
Integration with ARC Scripting:
Since ARC supports Python scripting, you can include such a function in one of your common scripts or as part of a custom robot skill. Then, throughout your project, you can call your logging function instead of just print(), ensuring that both the console and file output stay synchronized.
Final Thoughts
Currently, ARC does not offer an automatic log file for the debug output; you’ll need to implement your own using Python’s logging (and optionally file locking) capabilities. This approach will not only replicate what you’ve been doing in EZ-Script but also give you more control and robustness in your logging process.
Do you have any additional questions or need further suggestions on integrating this custom logging function into your robot’s Python code in ARC?
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
Logging is a great idea. However, note that no third-party solutions can prevent errors from a locked file. A file is locked when two or more processes attempt to write to it simultaneously. As you can imagine, that can't happen because it would create a mess. It would be like two people typing a sentence simultaneously with two keyboards on the same computer.
Use Python's built-in logging facilities for this.
If multiple processes run, ensure the file is not open for reading while writing. Also, please do not keep the file open in an editor while it is being written to. You can use baretail or a similar program to monitor the lines appended to it.
Thank you for the feedback, and double checking I am not building code while something is already available.
I will create a simple custom logging function in Python: try to lock the log file (basic portalocker functionality) to write a log entry, and if that is not possible, retry a few more times later. That is already an improvement over EZ-Script's FileWriteLine, where writing to a locked file caused the script to crash/stop.
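That lock-and-retry behavior can be sketched with the standard library alone (the function name, retry count, and delay below are illustrative choices, not ARC or portalocker APIs):

```python
import time

def write_log_line(path, line, retries=3, delay=0.5):
    """Try to append a line to the log file. If the file is locked
    by another writer (raised as an OSError, e.g. PermissionError on
    Windows), wait and retry instead of crashing the script the way
    EZ-Script's FileWriteLine did. Returns True on success."""
    for attempt in range(retries):
        try:
            with open(path, "a") as f:
                f.write(line + "\n")
            return True
        except OSError:
            time.sleep(delay)
    return False  # gave up after all retries; caller decides what to do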
Actually I have an idea. Let me have a logging function created. That way it can run from any thread so there will not be any locks. I think it’s useful for something like this.
When dealing with reading and writing files, that’s generally handled in a single thread, or a thread-safe process. But a logging system would need to be resilient to multi-threaded writes.
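One standard-library way to get that single-writer design is logging's QueueHandler/QueueListener pair: any thread can log, records go onto a queue, and one listener thread does all the file writes (the logger name "multi" and file name threads.log are illustrative):

```python
import logging
import logging.handlers
import queue
import threading

# One queue feeds one listener thread, which is the only file writer.
log_queue = queue.Queue()
file_handler = logging.FileHandler("threads.log")
file_handler.setFormatter(
    logging.Formatter("%(asctime)s,%(threadName)s,%(message)s"))
listener = logging.handlers.QueueListener(log_queue, file_handler)
listener.start()

tlogger = logging.getLogger("multi")
tlogger.setLevel(logging.INFO)
tlogger.addHandler(logging.handlers.QueueHandler(log_queue))

def worker(i):
    # Safe to call from any thread; it only enqueues the record.
    tlogger.info("servo event %d", i)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
listener.stop()  # drain the queue and flush remaining records to the file
```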
Thank you.
In EZ-Script, I had something as simple as the FileWriteLine snippet shown above, which creates a .CSV file (named and located via $logfilename) with four columns: timestamp, level, source, and action.
Looking forward to a standard ARC Python logging function!