- The church gave us the Bible.
- The church helps us understand it.
- The church is preaching the gospel to all nations.
The idea that the church gave us the Bible does not lead directly to the idea that the church was fully cognizant of what it was doing. God has used institutions of the world to further His purposes before and since. When He wanted His people to be able to leave Babylon, He moved Cyrus the Persian to act, Cyrus not knowing that he was operating according to God's designs.
The church doesn't help anybody understand the Bible. The Vatican NEVER says, "This is what we'll do because the Bible says this or that." The church does things directly against what the Bible says, and it isn't concerned, because it has taught the people that tradition is more important than the Bible. Even very clear teachings of Jesus are overstepped by the church.
"Call no one on earth your father; you have but one Father in heaven." (Matthew 23:9, New American Bible)
Very clear. Don't call any man "Father," meaning your spiritual father, of course. This is ignored by the church, and nobody notices, because the church doesn't teach what the Bible says.
"Now the Spirit explicitly says that in the last times some will turn away from the faith by paying attention to deceitful spirits and demonic instructions through the hypocrisy of liars with branded [numb] consciences. They forbid marriage and require abstinence from foods that God created to be received with thanksgiving." (I Timothy 4:1-3, NAB)
Does the church forbid priests to marry? Does that completely overstep the teaching of the Bible? Yes, but tradition comes before Bible teaching. Look up the history of priests being forbidden to marry. It all had to do with wealth: if priests married, upon their deaths their wealth would go to their wives and not to the church.
These are but a couple of examples of what I'm talking about.
Now, you say that the church "is preaching the gospel to all nations." What actually IS "the gospel"? Do you know Jesus' words at Matthew 24:14? Have you been taught what Jesus actually says? The "gospel" is the good news of the KINGDOM. Your Catholic Bibles ALL say this:
"And this gospel of the Kingdom will be preached throughout the world as a witness to all nations, and then the end will come." (NAB)
Have you been taught what the Kingdom is? Perhaps there has been mention of it being "in your heart"? That's not what the Bible teaches. It says that God's Kingdom is a real GOVERNMENT that will take control of the whole world, destroying all of men's governments. (Daniel 2:44; Isaiah 9:6, 7; Revelation 19:11-21) Can you say that this gospel of the Kingdom has been preached throughout the world? I have NEVER heard any pope utter the word "Kingdom," let alone explain what it actually is. The church's political buddies wouldn't like that, would they?
So, no, the church doesn't teach what the Bible says, and it has not taught anyone about the gospel of the Kingdom.