Conference Papers, Year: 2021

MASCARA-FPGA cooperation model: Query Trimming through accelerators

van Long Nguyen Huu, Julien Lallet, Emmanuel Casseau, Laurent d'Orazio

Abstract

Field Programmable Gate Arrays (FPGAs) have become attractive in recent years for accelerating database analysis. Meanwhile, Semantic Caching (SC) is a technique that optimizes the evaluation of database queries by exploiting the knowledge and resources contained in the queries themselves. Deploying SC on FPGAs is therefore relevant for improving both response time and quality of results, and thus overall system performance. To make SC scalable on FPGAs, we have proposed a ModulAr Semantic CAching fRAmework (MASCARA) in which relevant stages, or modules, can be converted into accelerators on FPGAs. In this paper, we present a complementary query processing platform based on a cooperation model between MASCARA and the FPGA. This approach extends classical SC, which runs mainly on the Central Processing Unit (CPU), by offloading computationally intensive phases to the FPGA. Moreover, MASCARA-FPGA organizes the workflow of query rewriting and partial query execution as a pipelined execution model in which multiple accelerators can run in parallel. In our experiments, Query Trimming reduces response time by up to 3.96 times with only one accelerator.
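
To illustrate the query-trimming idea mentioned in the abstract, the sketch below shows, in plain Python, how a semantic cache can split an incoming range query into a probe query (answerable from the cached region) and remainder queries (sent to the database server). This is a minimal illustration of the general technique only, not MASCARA's implementation; the names RangePredicate and trim_query, and the restriction to a single numeric attribute with simplified boundary handling, are assumptions made for brevity.

    # Illustrative sketch of semantic-caching query trimming on a single
    # numeric range predicate. Names and structure are hypothetical and
    # not taken from the paper.
    from dataclasses import dataclass
    from typing import List, Optional, Tuple


    @dataclass(frozen=True)
    class RangePredicate:
        """Closed interval [low, high] over one attribute."""
        low: float
        high: float

        def is_empty(self) -> bool:
            return self.low > self.high


    def trim_query(query: RangePredicate,
                   cached: RangePredicate
                   ) -> Tuple[Optional[RangePredicate], List[RangePredicate]]:
        """Split `query` into a probe part (answerable from the cached region)
        and remainder parts (to be fetched from the server)."""
        # Probe query: overlap between the query and the cached region.
        probe = RangePredicate(max(query.low, cached.low),
                               min(query.high, cached.high))
        if probe.is_empty():
            # No overlap: the whole query must go to the server.
            return None, [query]

        # Remainder queries: parts of the query outside the cached region
        # (boundary handling is simplified; closed intervals throughout).
        remainders = []
        if query.low < cached.low:
            remainders.append(RangePredicate(query.low, cached.low))
        if query.high > cached.high:
            remainders.append(RangePredicate(cached.high, query.high))
        return probe, remainders


    if __name__ == "__main__":
        # Cache holds tuples with 10 <= x <= 50; the new query asks for 30 <= x <= 80.
        probe, remainders = trim_query(RangePredicate(30, 80), RangePredicate(10, 50))
        print("probe:", probe)            # answered locally from the semantic cache
        print("remainders:", remainders)  # sent to the database server

In the cooperation model described by the paper, stages of this kind, along with the subsequent partial query execution, are the computationally intensive phases that can be offloaded to FPGA accelerators and pipelined.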
Main file: ssdbm2021_HAL.pdf (1.01 MB). Origin: Files produced by the author(s).

Dates and versions

hal-03503635 , version 1 (28-12-2021)

Identifiers

HAL Id: hal-03503635
DOI: 10.1145/3468791.3468795

Cite

van Long Nguyen Huu, Julien Lallet, Emmanuel Casseau, Laurent d'Orazio. MASCARA-FPGA cooperation model: Query Trimming through accelerators. SSDBM 2021 - 33rd International Conference on Scientific and Statistical Database Management, Jul 2021, Tampa, United States. pp.203-208, ⟨10.1145/3468791.3468795⟩. ⟨hal-03503635⟩