Dataset Sensitive Autotuning of Multi-versioned Code Based on Monotonic Properties: Autotuning in Futhark

Philip Munksgaard*, Svend Lund Breddam, Troels Henriksen, Fabian Cristian Gieseke, Cosmin Oancea

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

2 Citations (Scopus)
31 Downloads (Pure)

Abstract

Functional languages allow rewrite-rule systems that aggressively generate a multitude of semantically equivalent but differently optimized code versions. In the context of GPGPU execution, this paper addresses the important question of how to compose these code versions into a single program that (near-)optimally discriminates between them across different datasets. Rather than aiming at a general autotuning framework reliant on stochastic search, we argue that in some cases a more effective solution can be obtained by customizing the tuning strategy to the compiler transformation that produces the code versions. We present a simple and highly composable strategy, which requires that the (dynamic) program property used to discriminate between code versions conforms to a certain monotonicity assumption. When this assumption holds, our strategy guarantees that if an optimal solution exists, it will be found; when it does not hold, the strategy produces human-tractable and deterministic results that provide insight into what went wrong and how it can be fixed. We apply our tuning strategy to the incremental-flattening transformation supported by the publicly available Futhark compiler and compare it with a previous black-box tuning solution that uses the popular OpenTuner library. We demonstrate the feasibility of our solution on standard datasets of real-world applications and on public benchmark suites such as Rodinia and FinPar. Our approach shortens the tuning time by a factor of 6× on average and, more importantly, in five out of eleven cases it produces programs that are up to 10× faster than those produced by the OpenTuner-based technique.
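The transformation being tuned is easiest to see on a small nested-parallel program. The sketch below is our own illustrative example, not code taken from the paper: incremental flattening turns each level of nested parallelism into several semantically equivalent code versions, guarded at run time by comparing dataset properties (here, the sizes n and m) against compiler-generated threshold parameters, and it is these thresholds that the autotuner instantiates.

```futhark
-- Illustrative example (not from the paper): dense matrix-vector
-- multiplication exhibits two levels of nested parallelism -- an
-- outer map over the rows and an inner map2/reduce (a dot product).
-- Incremental flattening generates multiple code versions for it,
-- e.g. one exploiting only the outer parallelism and one exploiting
-- both levels, dispatched at run time via threshold comparisons.
def matvec [n][m] (A: [n][m]f32) (x: [m]f32) : [n]f32 =
  map (\row -> f32.sum (map2 (*) row x)) A
```

In recent versions of the compiler, this style of tuning is exposed through the `futhark autotune` tool (e.g. `futhark autotune --backend=opencl prog.fut`), which writes the inferred threshold values to a tuning file that `futhark bench` picks up automatically.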

Original language: English
Title of host publication: Trends in Functional Programming - 22nd International Symposium, TFP 2021, Revised Selected Papers
Editors: Viktória Zsók, John Hughes
Publisher: Springer Science and Business Media Deutschland GmbH
Publication date: 2021
Pages: 3–23
ISBN (Print): 978-3-030-83977-2
DOIs
Publication status: Published - 2021
Event: 22nd International Symposium on Trends in Functional Programming, TFP 2021 - Virtual, Online
Duration: 17 Feb 2021 – 19 Feb 2021

Conference

Conference: 22nd International Symposium on Trends in Functional Programming, TFP 2021
City: Virtual, Online
Period: 17/02/2021 – 19/02/2021
Series: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12834 LNCS
ISSN: 0302-9743

Keywords

  • Autotuning
  • Compilers
  • Flattening
  • GPGPU
  • Nested parallelism
  • Performance
