Abstract
Wells in traditional hydrothermal reservoirs are used to extract heat and to dispose of cooled water. In the first case, high productivity (the ratio of production flow rate to the pressure differential required to produce that rate) is preferred in order to maximize power generation while minimizing the parasitic energy loss of pumping. In the second case, high injectivity (the change in injection flow rate produced by a change in fluid injection pressure) is preferred in order to reduce pumping costs. To improve productivity or injectivity, cold water is sometimes injected into the reservoir in an attempt to cool and contract the surrounding rock matrix and thereby induce dilation and/or extension of existing fractures or to generate new fractures. Though the increases in permeability associated with these changes are likely localized, by improving connectivity to more extensive high-permeability fractures they can, at least temporarily, provide substantially improved productivity or injectivity. The effects of cold water injection on injectivity have been observed at many sites, and the data demonstrate that changes in injectivity can be relatively large. Gunnarsson (2011), for example, measured increases in injectivity of more than six times in the Hellisheidi field, SW Iceland, when the water injection temperature was lowered from 120°C to 20°C, even though the water viscosity at the lower injection temperature would have been ~5 times greater. Grant et al. (2013), in a review of field data from thermal stimulation tests, demonstrated a nonlinear relationship between injectivity and water injection temperature and noted that field data commonly demonstrate an injectivity increase that is proportional to t^n, where t is time and n is between 0.4 and 0.7.
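The power-law trend reported by Grant et al. (2013) can be sketched numerically. This is a minimal illustration only: the function name, the unit-free normalization, and the reference injectivity I0 are assumptions for demonstration, not quantities from the source.

```python
def injectivity_gain(t, n, i0=1.0):
    """Illustrative power-law injectivity growth during thermal stimulation.

    Models the field observation that injectivity increases roughly as
    I(t) = I0 * t**n, with n typically between 0.4 and 0.7 (Grant et al., 2013).

    t  : elapsed stimulation time (arbitrary units; t > 0)
    n  : power-law exponent, commonly 0.4-0.7
    i0 : reference injectivity at t = 1 (hypothetical normalization)
    """
    if t <= 0:
        raise ValueError("t must be positive")
    return i0 * t**n


# Compare the growth envelope at the low and high ends of the observed exponent range.
low = injectivity_gain(10.0, 0.4)   # slower growth
high = injectivity_gain(10.0, 0.7)  # faster growth
print(f"gain at t=10: n=0.4 -> {low:.2f}, n=0.7 -> {high:.2f}")
```

A larger exponent n steepens the injectivity rise with time, consistent with the nonlinear, temperature-dependent behavior the abstract describes.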