I’m trying to determine if an element is partially or fully in the viewport.
I’ve found the function below, which determines whether an element is fully in view, but I keep getting confused when trying to determine partial visibility. I don’t want to use jQuery.
Basically, the idea is that there will be an element on the page that could be out of view. Once the user scrolls that element into view, even partially, it should trigger an event. I’ll handle the event trigger by binding a scroll handler (sketched below); I just need the detection itself to work properly.
function isInViewport(element) {
    var rect = element.getBoundingClientRect();
    var html = document.documentElement;
    return (
        rect.top >= 0 &&
        rect.left >= 0 &&
        rect.bottom <= (window.innerHeight || html.clientHeight) &&
        rect.right <= (window.innerWidth || html.clientWidth)
    );
}
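For reference, the wiring I have in mind looks roughly like this (just a sketch: the element id, the event name, and the isPartiallyInViewport check are placeholders, and that check is exactly the part I’m missing):

// Rough sketch of the planned wiring; '#target' and 'inview' are placeholders.
// isPartiallyInViewport is the detection I still need to figure out.
var target = document.getElementById('target');

window.addEventListener('scroll', function () {
    if (isPartiallyInViewport(target)) {
        // Fire a custom event once the element is at least partially visible.
        target.dispatchEvent(new CustomEvent('inview'));
    }
});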
Any help would be greatly appreciated!
Answer
Late answer, but about a month ago I wrote a function that does exactly this: it determines how much of an element is visible in the viewport, measured as a percentage. I’ve tested it in Chrome, Firefox, IE11, and iOS on iPhone/iPad. The function returns true when at least X percent (a number from 0 to 100) of the element is visible. It only considers the element’s geometry; it does not detect whether the element is hidden via opacity, visibility, etc.
const isElementXPercentInViewport = function(el, percentVisible) {
    let rect = el.getBoundingClientRect(),
        windowHeight = (window.innerHeight || document.documentElement.clientHeight);

    return !(
        // Percentage of the element not cut off by the top of the viewport.
        Math.floor(100 - (((rect.top >= 0 ? 0 : rect.top) / -rect.height) * 100)) < percentVisible ||
        // Percentage of the element not cut off by the bottom of the viewport.
        Math.floor(100 - ((rect.bottom - windowHeight) / rect.height) * 100) < percentVisible
    );
};
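For example, to trigger something once when at least half of an element has scrolled into view, you could wire it up like this (the selector, the 50% threshold, and the event name are just examples):

// Usage sketch: fire a one-off custom event once at least 50% of the
// element has been scrolled into view.
const target = document.querySelector('#lazy-section');
let fired = false;

window.addEventListener('scroll', () => {
    if (!fired && isElementXPercentInViewport(target, 50)) {
        fired = true;
        target.dispatchEvent(new CustomEvent('partiallyVisible'));
    }
}, { passive: true });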