Low Cost Video Streaming through Mobile Edge Caching: Modelling and Optimization
Journal article, IEEE Transactions on Mobile Computing, 2019

Abstract

Caching content at the edge of mobile networks is considered a promising way to deal with the data tsunami. In addition to caching at fixed base stations or on user devices, it has recently been proposed that an architecture in which public or private transportation acts as mobile relays and caches might be a promising middle ground. While such mobile caches have mostly been considered in the context of delay-tolerant networks, in this paper we argue that they could be used for low cost video streaming without imposing any delay on the user. Users can prefetch video chunks into their playout buffer from encountered vehicle caches (at low cost) or stream from the cellular infrastructure (at higher cost) whenever their playout buffer empties while they watch the content. Our main contributions are: (i) to model the playout buffer in the user device and analyze its idle periods, which correspond to bytes downloaded from the infrastructure; (ii) to optimize the allocation of content to mobile caches so as to minimize the expected number of non-offloaded bytes. We perform trace-based simulations to support our findings, showing that up to 60 percent of the original traffic can be offloaded from the main infrastructure.
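The buffer dynamics the abstract describes can be illustrated with a toy simulation: playback drains the buffer at one second per second, vehicle encounters refill it with prefetched chunks, and any second played while the buffer is idle must be streamed from the cellular network. The sketch below is only a minimal illustration of that mechanism; the Poisson encounter process, the parameter names, and all numeric values are assumptions for the example, not the paper's model or results.

```python
import random

def simulate_offload(video_len_s=600, encounter_rate=0.02,
                     chunk_s=30, seed=0):
    """Toy simulation of the playout-buffer offloading idea.

    A user watches a video of `video_len_s` seconds. Vehicle encounters
    are modeled here as a Poisson process with `encounter_rate` events
    per second; each encounter prefetches `chunk_s` seconds of content
    into the playout buffer (low cost). Whenever the buffer is empty,
    the next second of playback is streamed from the cellular
    infrastructure (high cost). Returns the fraction of playback served
    from vehicle caches, i.e. offloaded from the infrastructure.
    All parameters are illustrative, not taken from the paper.
    """
    rng = random.Random(seed)
    buffer_s = 0.0        # prefetched seconds currently in the buffer
    cellular_s = 0.0      # seconds streamed from the infrastructure
    next_encounter = rng.expovariate(encounter_rate)
    for t in range(video_len_s):          # one playback second per step
        if t >= next_encounter:
            buffer_s += chunk_s           # prefetch a chunk from a vehicle
            next_encounter = t + rng.expovariate(encounter_rate)
        if buffer_s >= 1.0:
            buffer_s -= 1.0               # play from the buffer (offloaded)
        else:
            cellular_s += 1.0             # idle buffer: stream from cellular
    return 1.0 - cellular_s / video_len_s
```

Raising the encounter rate (denser vehicle fleet, or more copies of the content cached) increases the offloaded fraction, which is exactly the trade-off the paper's content-allocation optimization targets.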

Dates and versions

hal-01855304 , version 1 (07-08-2018)


Cite

Luigi Vigneri, Thrasyvoulos Spyropoulos, Chadi Barakat. Low Cost Video Streaming through Mobile Edge Caching: Modelling and Optimization. IEEE Transactions on Mobile Computing, 2019, 18 (6), ⟨10.1109/TMC.2018.2861005⟩. ⟨hal-01855304⟩