14:00–14:40 “BBC micro:bit as a data logger”, HAMADA Tatsuyoshi (Nihon University/OCAMI)
15:00–15:50 “LaTeX, Git, and Algebra: An Integrated Approach to Translation (Do Free Documents Dream of Commercial Publication?)”, KIMURA Iwao (Toyama University)
14:00–14:40 “BBC micro:bit as a data logger”, HAMADA Tatsuyoshi (Nihon University/OCAMI)
The BBC micro:bit is a low-cost, low-power single-board computer with built-in sensors for temperature, light, acceleration, magnetism, and sound. This presentation introduces the micro:bit’s functionality as a simple data logger.
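As an illustration of the kind of logging the talk covers, here is a minimal MicroPython sketch of our own (not the speaker’s code), assuming a micro:bit V2 and its built-in log module: the sensors are sampled on a timer and appended to on-board flash.

    # Minimal data-logger sketch (assumes micro:bit V2): sample the built-in
    # sensors every 10 seconds and append the readings to the on-board log.
    # The log appears as MY_DATA.HTM when the board mounts as a USB drive.
    from microbit import *
    import log

    # Column labels for the log; timestamps are recorded in seconds.
    log.set_labels('temperature_C', 'light', 'sound', timestamp=log.SECONDS)

    @run_every(s=10)
    def record():
        log.add(
            temperature_C=temperature(),       # on-chip temperature, degrees C
            light=display.read_light_level(),  # LED-matrix light sensor, 0-255
            sound=microphone.sound_level(),    # microphone level, 0-255
        )

    while True:
        sleep(1000)  # idle; sampling happens in the scheduled callback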
15:00–15:50 “LaTeX, Git, and Algebra: An Integrated Approach to Translation (Do Free Documents Dream of Commercial Publication?)”, KIMURA Iwao (Toyama University)
We will share a practitioner’s workflow for translation that leverages the original LaTeX sources
and figure code, using Emacs, pLaTeX/dvipdfmx, latexmk, and Git
(a minimal build configuration is sketched after this abstract).
We then reflect on why a translated text matters for learners for whom English is a real hurdle,
and for scientists and engineers outside the field.
Finally, we consider how to balance quality, availability, and sustainability
between “free” and “non-free” models; no definitive answers are promised, but we hope to give a gentle nudge that keeps the conversation going.
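As a minimal sketch of the build half of such a workflow (the exact settings are our assumption, not necessarily the speaker’s), a .latexmkrc along these lines lets latexmk drive the pLaTeX/dvipdfmx toolchain, so the whole translation rebuilds with a single run of latexmk:

    # .latexmkrc (a sketch; these settings are our assumption, adjust to taste)
    $latex    = 'platex -halt-on-error %O %S';  # .tex -> .dvi with pLaTeX
    $bibtex   = 'pbibtex %O %S';                # Japanese-aware BibTeX
    $dvipdf   = 'dvipdfmx %O -o %D %S';         # .dvi -> .pdf
    $pdf_mode = 3;                              # 3 = build the PDF via $dvipdf

With the build reproducible from one command, the Git side reduces to committing the .tex and figure sources and letting collaborators rebuild locally.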
In recent years, generative AI systems such as ChatGPT have come into wide use around the world, and most of them are based on the deep learning architecture known as the Transformer. In the past few years, research on learning symbolic computation with Transformers has also become increasingly active. In this talk, we introduce the work of Kera–Ishihara–Kambe–Vaccon–Yokoyama (2024) on learning Gröbner bases with Transformers, from the perspective of dataset generation. We will also give a demonstration of learning simple symbolic computations using CALT, a library for Transformer-based computer algebra developed by Kera–Arakawa–Sato.
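To make the dataset-generation perspective concrete, here is a deliberately naive sketch of our own (it is not the authors’ generation method and does not use CALT, whose API we do not reproduce here): sample random polynomial systems with SymPy, compute each Gröbner basis forward, and serialize both sides as strings for sequence-to-sequence training. Generating such pairs efficiently at scale, where forward computation becomes the bottleneck, is precisely the kind of issue the talk discusses.

    # Naive forward generation of (system, Groebner basis) training pairs.
    # Illustration only; not the algorithm of Kera et al. (2024), no CALT.
    import random
    from sympy import symbols, groebner

    x, y = symbols('x y')

    def random_poly(max_terms=3, max_deg=2):
        """Random bivariate polynomial with small nonzero integer coefficients."""
        while True:
            expr = sum(random.choice([-3, -2, -1, 1, 2, 3])
                       * x**random.randint(0, max_deg)
                       * y**random.randint(0, max_deg)
                       for _ in range(random.randint(1, max_terms)))
            if expr != 0:  # retry if the sampled terms happened to cancel
                return expr

    def make_pair():
        """One (input, target) string pair for a sequence-to-sequence model."""
        system = [random_poly() for _ in range(2)]
        gb = groebner(system, x, y, order='lex')
        return ' ; '.join(map(str, system)), ' ; '.join(str(p) for p in gb.exprs)

    if __name__ == '__main__':
        random.seed(0)
        for _ in range(3):
            print(make_pair())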