Blind or Visually Impaired (BVI) individuals rely on haptics far more than sighted people in their everyday lives to locate objects and acquire object details. This reliance puts them at higher risk of contracting viruses through close contact during a pandemic crisis (e.g., COVID-19). Traditional canes give BVI users only a limited perceptive range. Our project develops a wearable solution named Virtual Touch that augments BVI users' perceptive power so they can perceive objects near and far in their surrounding environment in a touch-free manner, and consequently carry out activities of daily living during pandemics more intuitively, safely, and independently. Virtual Touch combines a camera with a novel point-based neural network, TouchNet, tailored for real-time blind-centered object detection, and a headphone that announces semantic labels to the user. Through finger pointing, the BVI end user indicates where they are directing attention relative to their egocentric coordinate system, and on this basis we build attention-driven spatial intelligence.
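The attention-driven selection step described above can be sketched as follows. This is an illustrative example only, assuming the detector returns labeled bounding boxes and the fingertip has already been localized in the image; `Detection` and `attended_object` are hypothetical names, not TouchNet's actual interface.

```python
from dataclasses import dataclass
import math


@dataclass
class Detection:
    label: str
    box: tuple  # (x_min, y_min, x_max, y_max) in image coordinates


def attended_object(detections, fingertip):
    """Return the label of the detection the user is pointing at,
    or None if there are no detections.

    Boxes that contain the fingertip are preferred; otherwise the
    detection whose box center is nearest the fingertip is chosen.
    """
    def center_dist(d):
        cx = (d.box[0] + d.box[2]) / 2
        cy = (d.box[1] + d.box[3]) / 2
        return math.hypot(cx - fingertip[0], cy - fingertip[1])

    # Prefer boxes that actually contain the fingertip position.
    containing = [d for d in detections
                  if d.box[0] <= fingertip[0] <= d.box[2]
                  and d.box[1] <= fingertip[1] <= d.box[3]]
    candidates = containing or detections
    return min(candidates, key=center_dist).label if candidates else None


# Example: two detections, fingertip pointing inside the cup's box.
dets = [Detection("cup", (100, 120, 160, 200)),
        Detection("door", (300, 0, 480, 320))]
print(attended_object(dets, (130, 150)))  # → cup
```

The selected label would then be spoken through the headphone; a real pipeline would also smooth the pointing estimate over time to avoid jittery announcements.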