The International Conference on Informatics and Information Technologies is the 15th in a series of conferences organized by the Faculty of Computer Science and Engineering (FCSE).
Date and venue
April 20-22, 2018, Hotel Bistra, Mavrovo, Macedonia
Conference topics
- Artificial Intelligence, Robotics, Bioinformatics
- Multimedia, Signal Processing
- Computer Networks
- Sensor Networks
- Distributed Systems
- Computer Architecture and Parallel Processing
- Cloud and GRID Computing
- Wireless and Mobile Computing
- Security and Cryptography
- Theoretical Foundations of Informatics
- Applied Mathematics
- eWorld-eWork, eCommerce, eBusiness, eLearning
- ICT in education
- Green ICT
Keynote speakers
Prof. Ana Sokolova, PhD
Associate professor at University of Salzburg, Salzburg, Austria
Ana Sokolova is an associate professor at the Department of Computer Sciences, University of Salzburg. Before obtaining a faculty position at the University of Salzburg, she was an Elise Richter Fellow and a postdoc with Christoph Kirsch (University of Salzburg) and with Bart Jacobs (Radboud University Nijmegen). She obtained her PhD at TU Eindhoven and her Master's from the Ss. Cyril and Methodius University in Skopje, Macedonia. Her research interests vary from theoretical computer science (probabilistic systems, coalgebra, algebra, concurrency theory, formal methods in general) to applied topics like concurrent data structures and memory management.
Title of talk: Local Linearizability
Abstract: The semantics of concurrent data structures is usually given by a sequential specification and a consistency condition. Linearizability is the most popular consistency condition due to its simplicity and general applicability. Nevertheless, for applications that do not require all guarantees offered by linearizability, recent research has focused on improving performance and scalability of concurrent data structures by relaxing their semantics. In this talk, I will present local linearizability, a relaxed consistency condition that is applicable to container-type concurrent data structures like pools, queues, and stacks. We will briefly discuss theoretical and practical properties of local linearizability and its relationship to other existing consistency conditions.
Furthermore, I will present a generic implementation method for locally linearizable data structures that uses existing linearizable data structures as building blocks. Our implementations show performance and scalability improvements over the original building blocks and outperform the fastest existing container-type implementations.
This talk is based on joint work with Andreas Haas, Andreas Holzer, and Michael Lippautz (Google), Tom Henzinger (IST), Ali Sezgin (formerly U. of Cambridge), Christoph Kirsch (U. of Salzburg), and Helmut Veith (formerly TU Vienna).
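The generic construction mentioned in the abstract can be illustrated with a small sketch: each thread enqueues into its own linearizable FIFO segment, and removals may take from any segment, so only per-thread order is preserved. This is an illustrative toy under assumed simplifications (Python, a dict of `deque` segments), not the authors' actual implementation:

```python
import threading
from collections import deque

class LocallyLinearizableQueue:
    """Toy sketch: per-thread FIFO segments as linearizable building blocks."""

    def __init__(self):
        self._segments = {}             # thread id -> that thread's own segment
        self._lock = threading.Lock()   # guards segment creation only

    def enqueue(self, item):
        tid = threading.get_ident()
        seg = self._segments.get(tid)
        if seg is None:
            with self._lock:
                seg = self._segments.setdefault(tid, deque())
        seg.append(item)                # order of one thread's inserts is kept

    def dequeue(self):
        # Take from any non-empty segment: only per-thread FIFO order is
        # guaranteed, not a single global FIFO order across all threads.
        for seg in list(self._segments.values()):
            try:
                return seg.popleft()
            except IndexError:
                continue
        return None                     # every segment was empty
```

With a single producer thread the structure behaves exactly like a FIFO queue; with several producers, elements of different threads may be interleaved in any order, which is the relaxation local linearizability permits.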
Dimitar Nikolov, PhD
Consultant at Altran AB, Malmö, Sweden
Dimitar Nikolov received his Diploma Engineering degree from the Ss. Cyril and Methodius University in Skopje, Macedonia, in 2008. In 2012, he received the Licentiate degree from Linköping University, Linköping, Sweden, and in 2015, he received his PhD from Lund University, Lund, Sweden. Throughout his graduate studies, his research focus was on fault tolerance in real-time systems. Beyond fault tolerance, Dimitar’s research interests include design for testability of digital systems, test of 3D stacked integrated circuits, computer architectures, reconfigurable computing, and digital signal processing. From 2015 to 2017, he was a postdoctoral fellow at Lund University, working on challenges in Massive MIMO (Multiple Input Multiple Output), one of the key candidate technologies for the fifth generation of mobile networks (5G). His research outcomes have been presented at international conferences and published in international journals. In 2016, he co-authored the paper “A Self-Reconfiguring IEEE 1687 Network for Fault Monitoring”, which received the best paper award at the IEEE European Test Symposium 2016. Currently, he works as a consultant at Altran AB, on an assignment in the product development unit within Ericsson AB, working with advanced antenna systems.
Title of talk: Ensuring correct operation in presence of faults
Abstract: Rapid development of recent semiconductor technologies has enabled the manufacturing of electronic devices that provide previously unseen performance. Such devices will be used in applications on the 5G roadmap: smart cities, autonomous driving, etc. However, one major drawback of devices manufactured in recent semiconductor technologies is that they have become more susceptible to faults, and thus may not always provide correct operation, which is paramount for applications that require a high level of reliability. The aim of this talk is to introduce the field of fault tolerance, whose purpose is to ensure correct operation even in the presence of faults. The talk covers the common factors that cause devices to fail and how faults are classified, and gives an overview of existing techniques that cope with faults. It focuses in particular on providing fault tolerance for real-time systems; the fault tolerance technique Rollback Recovery with Checkpointing (RRC) will be discussed in detail. One drawback of RRC is that it introduces a time overhead, which can cause timing constraints in real-time systems to be violated. Since real-time systems are categorized as soft or hard, depending on how strict their timing constraints are, the talk will discuss how to optimize RRC for both soft and hard real-time systems.
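For intuition on the checkpointing/re-execution trade-off behind the RRC overhead, here is a minimal sketch of the classical first-order checkpointing model (Young's approximation). The cost and fault-rate figures are invented, and this is a textbook model, not the speaker's optimization method:

```python
import math

def young_interval(checkpoint_cost, mtbf):
    """First-order optimal checkpoint interval (Young's approximation)."""
    return math.sqrt(2.0 * checkpoint_cost * mtbf)

def expected_overhead(work, interval, checkpoint_cost, mtbf):
    """Expected RRC time overhead relative to fault-free execution.

    Overhead = cost of taking checkpoints + expected re-execution time:
    on average a fault wastes about half a checkpoint interval, and faults
    arrive at rate 1/mtbf over the stretched execution.
    """
    n_checkpoints = work / interval
    checkpoint_overhead = n_checkpoints * checkpoint_cost
    expected_faults = (work + checkpoint_overhead) / mtbf
    reexecution = expected_faults * (interval / 2.0)
    return checkpoint_overhead + reexecution

# Invented figures: 100 s of work, 1 s per checkpoint, mean time
# between faults of 200 s -> Young's interval is sqrt(2*1*200) = 20 s.
interval = young_interval(1.0, 200.0)
overhead = expected_overhead(100.0, interval, 1.0, 200.0)
```

Checkpointing too often inflates the checkpointing cost; too rarely, the expected re-execution after a fault dominates, which is exactly the tension the optimization of RRC has to resolve.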
Mihaela Angelova, PhD
Postdoctoral researcher at French Institute of Health and Medical Research, Paris, France
Mihaela Angelova is a postdoctoral researcher at the Laboratory of Integrative Cancer Immunology at the French National Institute of Health and Medical Research. She obtained her PhD degree in Bioinformatics at the Medical University of Innsbruck, Austria. Her work involves integrative, multi-parametric analysis of heterogeneous datasets to understand the tumor-immune interplay in solid tumors. Having studied Computer Science and Information Technologies at the Ss. Cyril and Methodius University in Skopje, Macedonia, she applies her multidisciplinary training to the field of cancer immunology and immunotherapy. She has received the Liechtenstein award for scientific research, Young Researcher awards from the President of R. Macedonia and from the Italian network of cancer biotherapy (NIBIT), as well as a Doctoral dissertation award from the Austrian cancer association.
Title of talk: Squamous lung carcinogenesis: immune evasion before tumor invasion
Abstract: Like most sporadic cancers, lung cancer emerges from premalignant lesions characterized by morphological and molecular changes. Little is known about how the immune system shapes this process of carcinogenesis. Characterization of pre-neoplastic and invasive bronchial lesions can provide insights into the mechanisms behind the onset and development of squamous cell lung carcinoma, revealing promising new biomarkers for early detection and treatment of lung cancer. We aim to apply an integrative approach on rare datasets to examine the stepwise evolution from premalignant lesions to invasive carcinoma under the influence of the immune system. Fresh frozen human bronchial biopsies (n=122) from 77 patients, covering 8 different morphological stages of lung squamous carcinogenesis, were subjected to gene expression profiling and multispectral imaging. A linear mixed-effects model and weighted correlation networks were applied to infer modules of gene co-expression. Immune cell-specific gene expression signatures were delineated and used to deconvolve the immune response across the different development stages. The spatial distribution of immune cells was analyzed with multispectral imaging.
The results showed distinct trajectories of cancer-intrinsic alterations, as well as evolution of the tumor microenvironment during the multistep process of carcinogenesis. Gene expression revealed the temporal order and evolution of cancer hallmarks in squamous lung carcinogenesis. The up-regulation of immune genes in high-grade lesions suggested a major role of the surrounding microenvironment through both adaptive and innate immune responses, but also a tumor-versus-host immunosuppressive response before tumor invasion across the basal membrane.
In conclusion, at a critical stage of carcinogenesis there is a significant modulation of the immune response, which can have implications for cancer prevention and provide insights into tumor biology.
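As a toy illustration of the signature-based deconvolution step described in the methods, the simplest form of a signature score is the mean expression of a cell type's marker genes in one sample. The gene set and expression values below are illustrative assumptions, not the study's actual signatures or data:

```python
def signature_score(expression, signature_genes):
    """Mean expression of a cell-type marker-gene set in one sample.

    expression: dict mapping gene symbol -> normalized expression value.
    Genes absent from the profile are skipped.
    """
    values = [expression[g] for g in signature_genes if g in expression]
    if not values:
        raise ValueError("no signature genes found in the profile")
    return sum(values) / len(values)

# Hypothetical T-cell marker set, for illustration only.
T_CELL_SIGNATURE = ["CD3D", "CD3E", "CD2"]

# One made-up sample profile (normalized expression values).
sample = {"CD3D": 2.0, "CD3E": 4.0, "CD2": 3.0, "GAPDH": 10.0}
score = signature_score(sample, T_CELL_SIGNATURE)
```

Tracking such scores across the morphological stages is one simple way to follow how an immune cell population's presence changes from premalignant lesions to invasive carcinoma.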
Tome Eftimov, PhD
Researcher at Jožef Stefan Institute, Ljubljana, Slovenia
Tome Eftimov is a researcher at the Computer Systems Department, Jožef Stefan Institute, in Ljubljana, Slovenia. In 2011, he obtained his bachelor's degree from the Faculty of Electrical Engineering and Information Technologies, Ss. Cyril and Methodius University in Skopje, Macedonia, and in 2013, he received his master's degree from the Faculty of Computer Science and Engineering at the same university. In 2018, he received his PhD degree in Information and Communication Technologies at the Jožef Stefan International Postgraduate School, Ljubljana, Slovenia. His areas of research include statistical data analysis, stochastic optimization algorithms, natural language processing, machine learning, and information theory.
Title of talk: Are we aware of the importance of proper study analysis? Deep Statistical Comparison: a case study of meta-heuristic stochastic optimization algorithms
Abstract: Working with experiments, we have models, we run tests, and we get numbers. Whenever we have numbers we need to analyze them, so we invoke a statistical analysis in order to perform a proper study analysis. However, published papers often borrow the analysis found in similar studies in the field, without the required in-depth understanding of what is done. Such an approach is prone to errors made out of ignorance, so the results can be misleading. We are going to show the most common mistakes found in papers that lead to improper application of statistics, and what a proper way of doing statistical analysis would be. The idea is to understand the area of statistical analysis of experiments, instead of blindly following the established procedures.
All this is important for determining the strengths and weaknesses of a selected algorithm, whose performance should be compared with the performance of state-of-the-art algorithms. The idea behind such comparisons is that, using the results obtained on different problems (e.g., functions, data sets), the "best" algorithm (i.e., the algorithm that performs best on average over all problems) can be found, or the benchmarking results can be used to transfer knowledge to a real-world problem. The statistical analyses performed in such cases are crucial and need to be made with great care, because they provide the information from which the conclusions are drawn. Nowadays, many researchers have problems selecting the right statistic to apply to a selected performance measure. Additionally, applying the appropriate statistical test requires knowing which conditions the data must meet for the test to be applicable. This kind of misunderstanding is all too common in the research community and can be observed in many high-ranking journal papers. For these reasons, we proposed a novel approach for statistical comparison, known as Deep Statistical Comparison, which provides more robust statistical results than previous state-of-the-art approaches when results are affected by outliers or by statistically insignificant differences between data values. An actual demonstration in which the audience will get familiar with a web-based tool (http://ws.ijs.si/dsc/), designed to make a deep statistical comparison easier, will be given at the end of the talk.
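As a small illustration of the outlier sensitivity that motivates robust comparisons, consider two algorithms benchmarked on the same problems. The numbers are invented, and this only contrasts mean- vs median-based summaries; it is not the Deep Statistical Comparison procedure itself:

```python
from statistics import mean, median

# Invented results of two algorithms on five benchmark problems (lower = better).
alg_a = [10.0, 11.0, 10.5, 10.2, 10.8]
alg_b = [10.4, 11.2, 10.9, 10.6, 500.0]  # one run hit by a crash-like outlier

# A mean-based comparison lets the single outlier dominate the conclusion.
mean_gap = mean(alg_b) - mean(alg_a)        # ~96: B looks far worse

# A median-based (rank-like) comparison is robust to that one bad run.
median_gap = median(alg_b) - median(alg_a)  # ~0.4: B is only slightly worse

print(f"mean gap: {mean_gap:.2f}, median gap: {median_gap:.2f}")
```

A comparison built on means would declare algorithm B dramatically worse because of one run, while a rank- or median-based view reports a small, more representative difference; choosing between such summaries, and testing the difference appropriately, is exactly the kind of decision the talk addresses.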