Note: This website is no longer updated and may contain outdated information. The new website is at https://www.uregina.ca/science/cs

Computational Learning Theory Lab

The Computational Learning Theory Lab is dedicated to research on the theoretical aspects of machine learning, an area of study often called Computational Learning Theory. The lab is located in Room 122 of the Classroom Building.

The laboratory equipment was purchased with a grant from the Canada Foundation for Innovation (CFI). Additional research support comes from the Natural Sciences and Engineering Research Council (NSERC), through its Discovery Grants program, and from the Canada Research Chairs program.

Director: Sandra Zilles

The Computational Learning Theory Laboratory currently hosts the following graduate students:

  • Mustakim Al Helal (MSc student)
  • Fahimeh Bayeh (MSc student)
  • Ziyuan Gao (PhD student)
  • Ashwani Kumar (PhD student, co-supervised by Dr. M. Babu)
  • Mohammad Hossein Nikravan (MSc student)
  • Yuan Xue (PhD student, co-supervised by Dr. B. Yang)

Publications in 2016:

  • Levi H.S. de Lelis, Roni Stern, Shahab Jabbari Arfaee, Sandra Zilles, Ariel Felner, Robert C. Holte. Predicting Optimal Solution Costs and Learning Heuristics with Bidirectional Stratified Sampling in Regular Search Spaces. Artificial Intelligence 230:51-73, 2016.
  • Malte Darnstädt, Thorsten Kiss, Hans Ulrich Simon, and Sandra Zilles. Order Compression Schemes. Theoretical Computer Science 620:73-90, 2016.
  • Ziyuan Gao, Frank Stephan, and Sandra Zilles. Partial Learning of Recursively Enumerable Languages. Theoretical Computer Science 620:15-32, 2016.
  • Hassan Waqar Ahmad, Sandra Zilles, Howard J. Hamilton, and Richard Dosselmann. Prediction of Retail Prices of Products Using Local Competitors. International Journal of Business Intelligence and Data Mining 11:19-30, 2016.
  • Ziyuan Gao, Christoph Ries, Hans Ulrich Simon, and Sandra Zilles. Preference-based Teaching. In: Proceedings of the 29th Annual Conference on Learning Theory (COLT) pp. 971-997, 2016.
  • Eisa Alanazi, Malek Mouhoub, and Sandra Zilles. The Complexity of Learning Acyclic CP-nets. In: Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI) pp. 1361-1367. AAAI Press, 2016.
  • Levi H.S. de Lelis, Santiago Franco, Marvin Abisrror, Mike Barley, Sandra Zilles, and Robert C. Holte. Heuristic Subset Selection in Classical Planning. In: Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI) pp. 3185-3191. AAAI Press, 2016.
  • Achilles Beros, Ziyuan Gao, and Sandra Zilles. Classifying the Arithmetical Complexity of Teaching Models. To appear in: Proceedings of the 27th International Conference on Algorithmic Learning Theory (ALT), 15 pages, 2016.
  • Shankar Vembu and Sandra Zilles. Interactive Learning from Multiple Noisy Labels. In: Proceedings of the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery (ECML-PKDD 2016), pp. 493-508, 2016.
  • Yuan Xue, Boting Yang, Farong Zhong, and Sandra Zilles. Fast Searching on Complete k-partite Graphs. To appear in: Proceedings of the 10th Annual International Conference on Combinatorial Optimization and Applications (COCOA). Springer Verlag. 15 pages, 2016.
