If the data looks like this:
const dis = {
  "data": [
    {
      "Hazard_type": ["Tornado", "Hurricane"],
      "County": "Anderson",
      "State": "TX",
      "FIPS_code": 48001,
      "Longitude": -95.687072,
      "Latitude": 31.776143,
      "Property_Damage": 10000000,
      "Crop_Damage": 0
    },
    {
      "Hazard_type": ["Hurricane"],
      "County": "Anderson",
      "State": "TX",
      "FIPS_code": 48001,
      "Longitude": -95.687072,
      "Latitude": 31.776143,
      "Property_Damage": 4914933.84,
      "Crop_Damage": 0
    }
  ]
};
And I want to create another JSON array that has the aggregate damage per unique tag in Hazard_type. What should be the approach here? (Very new to JavaScript.)
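For example, assuming the goal is one entry per distinct hazard tag with the damage summed across every record that carries that tag, the desired output might look something like the sketch below (the field names hazard and total_damage are only an illustration, and total_damage here sums Property_Damage for the two sample records):

[
  { "hazard": "Tornado", "total_damage": 10000000 },
  { "hazard": "Hurricane", "total_damage": 14914933.84 }
]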
Answer
Here’s one way you might accomplish the task, making use of Array.map, Array.flat, Array.filter, and Array.reduce.
Note that I changed the values of Crop_Damage from zeros to 1 and 3, to make it more evident that the code works.
const dis = {
  "data": [
    { "Hazard_type": ["Tornado", "Hurricane"], "County": "Anderson", "State": "TX", "FIPS_code": 48001, "Longitude": -95.687072, "Latitude": 31.776143, "Property_Damage": 10000000, "Crop_Damage": 1 },
    { "Hazard_type": ["Hurricane"], "County": "Anderson", "State": "TX", "FIPS_code": 48001, "Longitude": -95.687072, "Latitude": 31.776143, "Property_Damage": 4914933.84, "Crop_Damage": 3 }
  ]
};

// Keep only the last occurrence of each value, so every hazard appears exactly once.
const removeDuplicates = (key, index, array) => {
  return array.lastIndexOf(key) === index;
};

// Flatten all of the Hazard_type arrays into a single list, then deduplicate it.
const distinctHazards = dis.data
  .map(row => row.Hazard_type)
  .flat()
  .filter(removeDuplicates);

/*
 * Array.filter() ensures we only examine the subset of the array having to do with one hazard at a time.
 * Array.reduce() is an accumulator that simply sums the fields (Crop_Damage) up.
 */
const scores = distinctHazards.map(hazard => {
  const damages = dis.data.filter(row => {
    return row.Hazard_type.includes(hazard);
  });
  return {
    hazard,
    damages: damages
      .map(row => row.Crop_Damage)
      .reduce((a, b) => {
        return Number(a) + Number(b);
      })
  };
});

console.log(scores);
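Run against the sample above, this prints:

[ { hazard: 'Tornado', damages: 1 }, { hazard: 'Hurricane', damages: 4 } ]

If you also want Property_Damage included in the aggregate, the same pattern works with a slightly different reduce step. This is just a sketch under that assumption; the total_damage name is made up here:

const totals = distinctHazards.map(hazard => {
  // Rows that carry this hazard tag.
  const rows = dis.data.filter(row => row.Hazard_type.includes(hazard));
  // Sum Property_Damage and Crop_Damage across those rows.
  const total_damage = rows.reduce(
    (sum, row) => sum + Number(row.Property_Damage) + Number(row.Crop_Damage),
    0
  );
  return { hazard, total_damage };
});

console.log(totals);
// roughly [ { hazard: 'Tornado', total_damage: 10000001 },
//           { hazard: 'Hurricane', total_damage: 14914937.84 } ]
// (subject to the usual floating-point rounding)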