There is a problem with the vast number of weather datasets (416.2 million) I would have to generate to cover the Atlantic area for the period Sep '39 to May '45.

The Atlantic with its adjacent seas (Black Sea, Hudson Bay, North Sea, Baltic Sea, etc.) has an area of 106 million km^2. If I generate one dataset (wave height, cloud cover, precipitation, wind speed, wind direction, kind of precipitation, etc.) per longitude/latitude cell per hour, I need 8,500 datasets to cover the whole Atlantic plus adjacent seas for a single hour, since one dataset covers an area of about 12,500 km^2 (a cell of 112 km x 112 km).

8,500 datasets x 24 hours x 30 days x 68 months = 416.2 million datasets.

One weather dataset is 20 bytes long and is already in packed format, so the full set would come to roughly 8.3 GB. I would accept 7.5 million datasets, because 7.5 million datasets give a mod size of about 150 MB; I don't accept a mod size of 1 GB or so. So my goal is to reduce 416.2 million datasets to a maximum of 7.5 million datasets. Any thoughts?
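To make the arithmetic concrete, here is a minimal sketch (Python, purely for illustration) that reproduces the counts above and shows one hypothetical way of getting under the 7.5 million budget, assuming a doubled 224 km cell and a daily instead of hourly time step. The reduction strategy in the sketch is an assumption, not the mod's actual design.

```python
# Dataset-count arithmetic from the post, plus one hypothetical reduction.
AREA_KM2 = 106_000_000          # Atlantic + adjacent seas
CELL_KM = 112                   # one dataset covers a 112 km x 112 km cell
BYTES_PER_DATASET = 20          # packed format, per the post
HOURS = 24 * 30 * 68            # hourly steps, 30-day months, Sep '39 - May '45

cells = AREA_KM2 / (CELL_KM * CELL_KM)   # ~8,450 cells; the post rounds to 8,500
full = 8_500 * HOURS                     # ~416.2 million datasets
print(f"cells per hour: {cells:.0f}")
print(f"full set: {full/1e6:.1f} M datasets, {full*BYTES_PER_DATASET/1e9:.1f} GB")

# Hypothetical reduction: 224 km cells (1/4 the cell count) and one dataset
# per day instead of per hour (1/24 the time steps) cut the total by 96x.
reduced = (8_500 // 4) * (HOURS // 24)
print(f"reduced: {reduced/1e6:.2f} M datasets, "
      f"{reduced*BYTES_PER_DATASET/1e6:.0f} MB")
```

With these (assumed) settings the count drops to about 4.3 million datasets, roughly 87 MB, which is well under the 7.5 million / 150 MB budget; any combination of spatial and temporal coarsening that reduces the total by a factor of about 56 would also fit.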