I am new to coding and am currently learning the JavaScript part. I have a question about how to calculate the different tier prices and charges as shown in the image: Tier Pricing Table
This is the code that I have tried. i represents the LOAN AMOUNT tier. When i is equal to 1 it will use the next tier percentage for the calculation, and then i will be increased by 1 again, but I am not sure whether this is correct or not. And should I add an EventListener to determine which tier the input is in right now? (I put a rough sketch of what I mean under my code.) Sorry for my broken English. Thanks a lot! ^_^
if (loan >= 500000 && loan < 1000000 && i == 0) { charge = 0.01;  i = 1; } // 1%
if (loan >= 500000 && loan < 1000000 && i == 1) { charge = 0.008; i = 2; } // 0.8%
if (loan >= 2000000 && loan < 2000000 && i == 2) { charge = 0.007; i = 3; } // 0.7%
if (loan >= 2000000 && loan < 2000000 && i == 3) { charge = 0.006; i = 4; } // 0.6%
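This is roughly what I mean by adding an EventListener. The input id "loan-amount" is just made up for the example, and I am not sure this is the right approach:

// Hypothetical input field on the page: <input id="loan-amount" type="number">
const loanInput = document.getElementById('loan-amount');

loanInput.addEventListener('input', () => {
  const loan = Number(loanInput.value);
  // ...here I would check which tier the loan falls into and set the charge...
  console.log('current loan amount:', loan);
});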
Answer
Not quite sure what your goal is, but if you want a system like tax brackets, where the loan amount is charged in portions based on the remaining money and where each portion fits into a tier, it might look a little like this:
const getChargedSums = (amount) => {
  let loanAmount = amount;
  let sums = [];

  // Tier 1: first 500,000 charged at 10%
  loanAmount -= 500000;
  if (loanAmount < 0) {
    sums.push((loanAmount + 500000) * 0.1);
    return sums;
  }
  sums.push(500000 * 0.1);

  // Tier 2: next 500,000 charged at 8%
  loanAmount -= 500000;
  if (loanAmount < 0) {
    sums.push((loanAmount + 500000) * 0.08);
    return sums;
  }
  sums.push(500000 * 0.08);

  // Tier 3: next 2,000,000 charged at 7%
  loanAmount -= 2000000;
  if (loanAmount < 0) {
    sums.push((loanAmount + 2000000) * 0.07);
    return sums;
  }
  sums.push(2000000 * 0.07);

  // Tier 4: next 2,000,000 charged at 6%
  loanAmount -= 2000000;
  if (loanAmount < 0) {
    sums.push((loanAmount + 2000000) * 0.06);
    return sums;
  }
  sums.push(2000000 * 0.06);

  // Tier 5: next 25,000,000 charged at 5%
  loanAmount -= 25000000;
  if (loanAmount < 0) {
    sums.push((loanAmount + 25000000) * 0.05);
    return sums;
  }
  sums.push(25000000 * 0.05);

  // Anything above that is also charged at 5%
  sums.push(loanAmount * 0.05);
  return sums;
};

let chargedSums = getChargedSums(800000);
console.log('Charged:', chargedSums.reduce((a, b) => a + b, 0));
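If you want to avoid repeating the same block for every tier, the same idea can be written as a loop over a table of tiers. This is only a sketch: the tier sizes and rates below are the ones assumed in the example above, so replace them with the values from your actual pricing table.

// Each entry is the size of that bracket and the rate charged on it.
const tiers = [
  { size: 500000,   rate: 0.1  },
  { size: 500000,   rate: 0.08 },
  { size: 2000000,  rate: 0.07 },
  { size: 2000000,  rate: 0.06 },
  { size: 25000000, rate: 0.05 },
];

const getCharge = (amount) => {
  let remaining = amount;
  let charge = 0;
  for (const { size, rate } of tiers) {
    if (remaining <= 0) break;
    const portion = Math.min(remaining, size); // part of the loan that falls in this tier
    charge += portion * rate;
    remaining -= portion;
  }
  // Anything above the last tier keeps the last rate, as in the code above.
  if (remaining > 0) charge += remaining * 0.05;
  return charge;
};

console.log('Charged:', getCharge(800000)); // same result as above: 74000

Keeping the tiers in a table like this also makes it easy to change the brackets later without touching the calculation itself.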