Karim AEBISCHER

Hello everyone! I graduated with a Master's degree in Computer Science in 2022 and have experience in developing innovative solutions that bridge technology and accessibility. My work includes the development of an Augmented Reality (AR) platform aimed at the rehabilitation of visually impaired individuals, incorporating performance evaluation metrics and usability testing. I co-designed AR rehabilitation tasks and conducted user experience evaluations to refine the platform.

I contributed to the development of the GamesHub digital platform, a project led by the Centre de Recherche sur l'Enseignement/Apprentissage par les Technologies numériques (CRE/ATE) at the Haute École pédagogique Fribourg (HEP-FR) in collaboration with the IT department of the École des Métiers de Fribourg (EMF).

2026

Scientific Collaborator

Jan 2025 - Today

I am currently working at the University of Fribourg for the Human-IST Institute. My task is to manage and maintain the infrastructure of the SAIS research group and to provide tools and services for its researchers (more info).

2025

Scientific Collaborator

Jan 2025 - Dec 2025

I worked as a scientific collaborator at the Haute École Pédagogique Fribourg for the UR PL&M research group. My task was to design and develop a game that integrates a mathematical statement generator (more info).

2024

Junior Researcher

Mar 2023 - Dec 2024

I worked as a junior researcher at the University of Fribourg for the Human-IST research group. My task was to design and develop a rehabilitation task in augmented reality for visually impaired people (more info).

2023

Scientific Collaborator

Feb 2023 - June 2023

I worked as a scientific collaborator at the University of Fribourg for the Master's course Design & Graphics Programming for Game Development, under the supervision of Dr Maurizio Rigamonti (more info).

2022

MSc in Computer Science

2018 - 2022

I obtained a Master's degree through the Swiss Joint Master of Science in Computer Science, a study program offered jointly by the universities of Bern, Neuchâtel, and Fribourg. My Master's thesis was in collaboration with a neuroscience research group at the University of Fribourg (more info).

2020

Administrative and computer support

Feb 2020 - Aug 2020

During my second civilian service, I had the opportunity to work in a very stimulating institution for the elderly. Since the institution was well equipped technologically, I was able to help improve some of their workflows during the Covid-19 pandemic. Among other things, I developed a web appointment-booking module with WordPress during this period.

2019

Cafeteria assistant

Sep 2019 - Feb 2020

During my first civilian service, I worked in an institution for people with disabilities. I had the opportunity to follow two training courses: "Communication and Support" and "Assistance to People with Disabilities" (more info).

2018

BSc in Computer Science

2014 - 2018

I obtained a Bachelor's degree in Computer Science with a minor in Business Informatics at the University of Fribourg. My Bachelor's thesis was on the development of a web extension (more info).

2017

Volunteer at Fribourg Olympic

Oct 2014 - Jun 2017

I helped with the bar service during the basketball seasons of the Fribourg Olympic team, alongside a fun and hardworking bar team.

Rclone Manager Interface

Work/Personal Project

  • Python

A robust GUI application designed to streamline Rclone configurations and operations, offering key features for seamless cloud storage management:
- Sync Jobs: Schedule and manage file synchronization tasks across your cloud storage providers
- Auto-Mounts: Automatically mount remote storage, making it accessible as a local drive
- Auto-Encrypt Files: Secure your files with automated encryption, ensuring data privacy at all times
Built to enhance productivity and security for users relying on cloud storage solutions.

AR platform for rehabilitation

Work Project

  • Unity Engine
  • C#
  • Python

Development of an augmented reality platform for the rehabilitation of visually impaired people (due to eye conditions or cognitive disorders). Development of a system including metrics for performance evaluation (e.g., pointing, usability). Co-design of AR rehabilitation tasks to be implemented on the platform. User experience evaluations of the above-mentioned platform.

An iOS Budget Tracker

Personal Project [WIP]

  • Swift

Having always wanted to explore the Swift programming language, I recently started a small personal project: a simple and intuitive application to track expenses, whether recurring or not. This iOS application will eventually be available on the App Store for anyone who finds it useful.

A Gamified Mobile Application for Approach-Avoidance Tasks

MSc Thesis

  • Unity Engine
  • C#
  • Python

Gamification of a cognitive task named the Approach-Avoidance Task (AAT), used to measure cognitive biases. The application was developed with the Unity engine for smartphone devices. Two prototypes of the AAT for smartphones were developed: a so-called traditional variant based on pre-existing designs found in the literature, and a gamified variant where game design elements were added to make the task more interesting and engaging. An evaluation of the two prototypes with 25 participants was also conducted. We compared the emotions (pleasure, arousal, dominance), the experience (competence, sensory & immersion, challenge), and the performance (reaction times, accuracy, approach bias) of the participants between the two prototypes. This project was in collaboration with a neuroscience research group at the University of Fribourg.

Dark & Night Modes: In Combination with Light Environment

University Seminar

  • LaTeX

During a seminar of the HCI group of the Human-IST Institute at the University of Fribourg, I had the chance to write a small literature review on a topic I particularly like: dark mode or night mode in real settings. The aim of the HCI seminar is to examine this question and to synthesize state-of-the-art research in Human-Computer Interaction.

Party4Glory

University Project

  • Unity Engine
  • C#

A party game for four players built with the Unity engine, composed of nine arcade-style mini-games playable on PC with keyboard and/or gamepad support. It was kept under development and presented at a competition named the SGDA Junior Swiss Game Award, where it was nominated (more info). After that, it was also shown during FriLan, an event held at the University of Fribourg (more info). The game's last public appearance was during the Explora event at the University of Fribourg, showcasing to future students what can be done in Computer Science (more info).

RESTful API / Containers Automatization / Serverless API

University Project

  • Python/Flask
  • Docker
  • Kubernetes
  • AWS PaaS/FaaS

To get hands-on experience with recent Cloud Computing technologies, we had a very interesting project. We built a RESTful application in Python/Flask, used Docker to automatically set up the containers for every web service, deployed the web services on Google Cloud using Kubernetes and DockerHub, and deployed a serverless API using AWS PaaS/FaaS.
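As a minimal sketch of the kind of RESTful endpoint this project involved (the `services` resource and its fields are hypothetical, not the original API):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
services = {}  # hypothetical in-memory registry of web services


@app.route("/services", methods=["GET"])
def list_services():
    """Return every registered service as a JSON array."""
    return jsonify(list(services.values()))


@app.route("/services/<name>", methods=["PUT"])
def register_service(name):
    """Create or update a service entry from the JSON request body."""
    services[name] = {"name": name, **(request.get_json(silent=True) or {})}
    return jsonify(services[name]), 201
```

Such an app is what gets packaged into a Docker image, so Kubernetes can then manage one container per service.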

Various Concurrent Programming Problems

University Series

  • Java

I followed an interesting course named "Concurrency - Multi-core Programming and Data Processing" in which we solved various concrete concurrency problems over the semester. Some that I found particularly fun: the producer-consumer problem using a shared circular buffer, the Dining Savages and Dining Philosophers problems, and edge-smoothing and image-blurring algorithms. All solutions were multithreaded, and performance comparisons were made.
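The course was in Java; as a language-neutral illustration, here is a minimal Python sketch of the producer-consumer pattern over a shared circular buffer, using condition variables to block producers when full and consumers when empty:

```python
import threading


class BoundedBuffer:
    """Circular buffer shared between producer and consumer threads."""

    def __init__(self, capacity):
        self.items = [None] * capacity
        self.capacity = capacity
        self.count = 0
        self.head = 0  # next slot to read
        self.tail = 0  # next slot to write
        lock = threading.Lock()
        self.not_full = threading.Condition(lock)
        self.not_empty = threading.Condition(lock)

    def put(self, item):
        with self.not_full:
            while self.count == self.capacity:
                self.not_full.wait()  # block until a consumer frees a slot
            self.items[self.tail] = item
            self.tail = (self.tail + 1) % self.capacity
            self.count += 1
            self.not_empty.notify()

    def get(self):
        with self.not_empty:
            while self.count == 0:
                self.not_empty.wait()  # block until a producer adds an item
            item = self.items[self.head]
            self.head = (self.head + 1) % self.capacity
            self.count -= 1
            self.not_full.notify()
            return item
```

The `while` loops (rather than `if`) guard against spurious wake-ups, the standard idiom with condition variables in Java and Python alike.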

Web Extension for Hypermedia Adaptability

BSc Thesis

  • JavaScript
  • HTML5
  • CSS

A Firefox add-on using the cross-browser WebExtensions API, developed with JavaScript, HTML5, and CSS. The extension, named HyperFocus, can be activated on any web page and allows the user to consult four types of hypermedia (text, images, videos/music, hyperlinks) in isolation.

Development of a distributed system for dictionary attacks using the Erlang programming language

University Project

  • Erlang
  • Python
  • Shell Scripting

The objective was to parallelize the hash computation as efficiently as possible across a cluster, while taking into account the varying hardware capacities of the different machines, failure handling at several levels of the system, and synchronization of each machine's progress on the overall problem.
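The project itself was written in Erlang; as a language-neutral illustration of the work-splitting idea, here is a small Python sketch (the names `check_chunk` and `dictionary_attack` are hypothetical). The wordlist is cut into chunks that workers pull as they finish, so faster workers naturally process more chunks, a simple way to absorb heterogeneous hardware speeds:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor, as_completed


def check_chunk(target_digest, chunk):
    """Hash each candidate word; return the word whose digest matches, else None."""
    for word in chunk:
        if hashlib.sha256(word.encode()).hexdigest() == target_digest:
            return word
    return None


def dictionary_attack(target_digest, wordlist, workers=4, chunk_size=1000):
    chunks = [wordlist[i:i + chunk_size] for i in range(0, len(wordlist), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(check_chunk, target_digest, c) for c in chunks]
        for fut in as_completed(futures):  # harvest results as workers finish
            word = fut.result()
            if word is not None:
                for f in futures:
                    f.cancel()  # stop work that hasn't started yet
                return word
    return None
```

In the distributed Erlang version, the chunks are messages sent to worker processes on different machines, and failure handling amounts to re-queuing the chunks of a crashed worker.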

Real-Time Object Detection and Augmented Reality to Support Low-Vision Navigation and Object Localization: A Demonstration

Work Publication

  • Unity Engine
  • C#

Low-vision (LV) people often face difficulties in navigating complex environments and recognizing objects due to reduced visual acuity, contrast sensitivity, or field of view. In this paper, we present a prototype that combines real-time object detection with augmented reality (AR) visualization to enhance spatial awareness for LV users, supporting safe navigation and object localization in indoor spaces. The system integrates an RGB-D camera and a Microsoft HoloLens 2 headset via ROS and Unity, using a YOLOv11-based perception pipeline to detect and localize static and dynamic objects in 3D and display high-contrast AR icons above the detected objects within the user’s field of view, conveying both their position and identity.

Enhancing Therapist-Guided Low-Vision Training with Projected Gaze Behaviors in Co-Located Shared AR. In ACM Symposium on Spatial User Interaction

Work Publication

  • Unity Engine
  • C#

Low-vision (LV) individuals often face challenges with visual search and object recognition due to partial loss of visual function. While rehabilitation centers help clients develop compensatory strategies using residual vision, interpreting clients’ visual behavior, especially in spatially rich contexts or when using head movements for eccentric viewing, can be difficult. This paper explores the use of co-located shared augmented reality (S-AR) with projected gaze cues (eye gaze, head gaze, eye trace, and field of view) to support LV therapist (LVT)-guided training. Built on the Microsoft HoloLens 2, our system projects these cues in real-time within an S-AR environment. Grounded in a field observation and discussions with two certified LVTs, we designed an AR-based visual search task inspired by current rehabilitation practices. Through an exploratory study with nine LV clients and LVTs, we assessed the potential of gaze projections to support guided training in the future. Our findings highlight the system’s potential to enhance LVTs’ understanding of client strategies, engage clients in training tasks, and facilitate more personalized guidance, while pointing out areas for further refinement in cue interpretability and system adaptability.

Exploring Shared Augmented Reality for Low-Vision Training of Activities of Daily Living. In The 27th International ACM SIGACCESS Conference on Computers and Accessibility

Work Publication

  • Unity Engine
  • C#

Mixed reality, particularly virtual reality (VR), has shown potential for low-vision (LV) training by providing safe environments to practice orientation and mobility skills. However, VR offers limited support for natural co-located interaction with low-vision therapists (LVTs), and its application to training activities of daily living (ADLs) remains underexplored. This paper investigates the potential of co-located shared augmented reality (S-AR) to support LVT-guided LV training through spatialized, interactive tasks. Using a custom AR platform developed for the Microsoft HoloLens 2, we collaborated with two LVTs to iteratively design tasks inspired by their existing training practices. Following two feasibility studies with two LV participants, we conducted an exploratory evaluation with eight additional LV participants to gather preliminary insights into accessibility, usability, and engagement. While participants generally found the tasks enjoyable and engaging, their feedback highlighted the need for adaptable designs to accommodate diverse visual needs. The LVTs valued the platform’s potential for real-time observation and saw promise in its use as a motivating, customizable complement to existing training methods.