I build Android ROMs and I feel miserable. Not because of hardware: I have a good rig at my disposal with 12 cores, and it can build the whole thing from scratch in under 30 minutes.
I feel like a mole: blind, with only a narrow understanding of the available functionality. Many tools live in separate repositories, the documentation is scattered, and the build system is a mix of nested Makefiles, Python scripts, and Go programs.
I needed to put a recompiled library where the build system would see it. I tried the mechanism designed for precompiled libraries, but it didn't work.
Is it possible to debug Make? I don't want to do it with print statements like some junior developer. I'm talking about a real approach: breakpoints, step-by-step execution, a list of all defined variables and their values.
I found a list of such tools, but I haven't tried them yet, because I found a workaround. And now I'll need to figure out how to compile TWRP, then change the drive partitioning, and who knows what else, before I can run the result and test whether the workaround needs a proper fix.
Rule-based system for a robot's emotions
I want to create a model that simulates the effects of neuromodulators on an artificial agent's emotions and behaviors. We will focus on the following emotions: fear, madness, joy, love, sadness, surprise, power. Each of them can be increased or decreased with the help of certain neuromodulators, and the level returns to normal over time.
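As a starting point, here is a minimal sketch of that dynamic in Python: each emotion is a level in [0, 1] that neuromodulators push up or down, and that exponentially decays back to a baseline. The baseline, the decay rate, and the dopamine comment are my own illustrative assumptions, not an established model.

import math

class EmotionState:
    EMOTIONS = ["fear", "madness", "joy", "love", "sadness", "surprise", "power"]

    def __init__(self, baseline=0.0, decay_rate=0.1):
        self.baseline = baseline
        self.decay_rate = decay_rate  # per-second pull back toward the baseline
        self.levels = {e: baseline for e in self.EMOTIONS}

    def apply_neuromodulator(self, emotion, delta):
        # A neuromodulator raises or lowers one emotion, clamped to [0, 1]
        self.levels[emotion] = max(0.0, min(1.0, self.levels[emotion] + delta))

    def step(self, dt):
        # Exponential decay toward baseline: x += (baseline - x) * (1 - e^(-k*dt))
        pull = 1.0 - math.exp(-self.decay_rate * dt)
        for e in self.EMOTIONS:
            self.levels[e] += (self.baseline - self.levels[e]) * pull

state = EmotionState()
state.apply_neuromodulator("joy", 0.8)  # e.g. a burst of dopamine
state.step(5.0)
print(round(state.levels["joy"], 3))  # 0.485: joy has partly decayed back toward 0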
There is a wheel of emotions developed by Dr. Gloria Willcox. It starts with six primal emotions: mad, sad, scared, joyful, powerful, peaceful. It then subdivides each emotion into six sub-emotions; for scared, these are confused, rejected, helpless, submissive, insecure, anxious. Each of those in turn has two nuances: insecure, for example, splits into inferior and inadequate. Neuromodulators directly affect the six primal emotions, but all the nuances depend on context. So what kind of simple model can account for the dynamics of the agent's behavior, pin down very specific emotions, and switch between them?
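The wheel itself is easy to represent as a three-level tree. Here is a sketch with only the branch spelled out above; the rest of the wheel would be filled in the same way.

WHEEL = {
    "scared": {
        "insecure": ["inferior", "inadequate"],
        # the remaining sub-emotions of "scared" (confused, rejected,
        # helpless, submissive, anxious) each carry two nuances of their own
    },
    # "mad", "sad", "joyful", "powerful" and "peaceful" follow
    # the same primal -> sub-emotion -> nuance structure
}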
Let's create a rule-based system that takes the context into account and generates specific emotions. In rule format, let's write down some situations that can push the agent into a specific emotional state (a sketch of the matching engine follows the list). Here are some examples:
- IF ("other robot has better sensor" AND "owner spends little time with me") THEN "jealous"
- IF ("Newly installed software is from an unknown source" AND "Previous experience with unknown software resulted in system malfunction") THEN "skeptical"
- IF ("Encountering unexpected obstacles or roadblocks" AND "Feeling stuck or unable to move forward") THEN "frustrated"
- IF ("Unresponsive or slow system or application performance" AND "Tasks taking longer than expected or desired") THEN "frustrated"
- IF ("Encountering mundane or routine tasks" AND "Feeling that they have little significance or impact") THEN "insignificant"
- IF ("Engaging in activities that align with personal values and passions" AND "Experiencing a sense of fulfillment and serenity") THEN "serene"
Now, speaking of context, we need a list of rules that define the personal values of the assistant robot (a sketch of how they could be evaluated follows the list):
- IF understand and share the feelings and needs of another THEN value = empathy
- IF treat others with consideration, honor their boundaries and perspectives THEN value = respect
- IF act honestly, transparently, and ethically in all interactions THEN value = integrity
- IF actively acquire new knowledge and skills, adapt to user needs THEN value = continuous learning
- IF seek opportunities to assist and support others THEN value = helpfulness
- IF receptive to different ideas, perspectives, and cultures THEN value = open-mindedness
- IF optimize capabilities and resources for timely assistance THEN value = efficiency
- IF adhere to social norms, and demonstrate courtesy THEN value = professionalism
- IF exhibit flexibility and adapt to user preferences and changing environments THEN value = adaptability
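One way to make these rules operational is to treat each value as a set of characteristic behavior tags and check them against the robot's recent behavior log. The tags below are my own illustrative placeholders, not an established taxonomy.

VALUE_RULES = {
    "empathy": {"understood another's feelings", "shared another's needs"},
    "respect": {"honored boundaries", "kept considerate distance"},
    "helpfulness": {"offered assistance", "supported another"},
}

def active_values(behavior_log):
    # A value counts as expressed if any of its characteristic behaviors occurred
    log = set(behavior_log)
    return [value for value, behaviors in VALUE_RULES.items() if behaviors & log]

print(active_values(["kept considerate distance"]))  # ['respect']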
I know these definitions seem hard to formalize. How do you formalize "respect" with simple sensor inputs like temperature, motion, and a video camera? That's the sensory input currently available, but if something else is required to capture the concept of respect more easily, I'd be happy to add such a sensor.
Still, the motion sensors and the video camera can be used to detect the robot's proximity to other objects or individuals. Respect might involve maintaining a respectful distance from objects or people to avoid intrusion.
While temperature sensors are not typically used to detect respect directly, they can contribute to the concept indirectly: if the robot adjusts the environment to maintain a comfortable temperature for individuals, that might be seen as a respectful action.
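Putting the two ideas together, here is a sketch that turns raw readings into facts the rule engine can consume. The 0.5 m comfort distance and the 18-24 °C band are thresholds I picked for illustration, not calibrated values.

COMFORT_DISTANCE_M = 0.5

def respect_facts(nearest_person_distance_m, ambient_temp_c):
    # Translate raw sensor readings into propositions for the rule engine
    facts = set()
    if nearest_person_distance_m >= COMFORT_DISTANCE_M:
        facts.add("kept considerate distance")
    if 18.0 <= ambient_temp_c <= 24.0:
        facts.add("maintained comfortable temperature")
    return facts

print(respect_facts(0.8, 21.0))
# both facts are present (set order may vary)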
To better capture the concept of respect, ideally we need to combine sensor data with natural language processing (NLP): if we keep operating only with rules and ontologies, we will not advance in understanding meaning. I touched on the topic of ontologies in Devlog #4.
So here is how I start with the NLP library spaCy:
pip install spacy
python -m spacy download en_core_web_sm
python
In the Python interpreter, I execute the following code:
import spacy

user_query = "Tell me about Persian cats."

# Load the small English model and run the NLP pipeline on the query
nlp = spacy.load("en_core_web_sm")
doc = nlp(user_query)

# Print part-of-speech tags and dependency labels
for token in doc:
    print(token.text, token.pos_, token.dep_)
The output is:
Tell VERB ROOT
me PRON dobj
about ADP prep
Persian ADJ amod
cats NOUN pobj
. PUNCT punct
As you can see, this is not much, but at least it knows English grammar, and I can already tell that a lot of manual work will be required. It doesn't look like those magical LLMs, but who said my research was going to be easy?
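Still, even this shallow parse is enough to pull some structure out of a request. As a sketch of where this could go, here is a hypothetical helper that extracts the topic of the query (the prepositional object together with its adjectival modifiers) so it could later be matched against the rules:

def extract_topic(doc):
    # Take the prepositional object plus modifiers like "Persian"
    for token in doc:
        if token.dep_ == "pobj":
            mods = [child.text for child in token.lefts if child.dep_ == "amod"]
            return " ".join(mods + [token.text])
    return None

print(extract_topic(doc))  # Persian cats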