Identifying Artifacts Resolved from Bintray (JCenter)

A brute-force method (a.k.a. hack) for detecting which artifacts are being resolved from Bintray repositories in your Gradle project.

The Context

If you haven’t heard, JFrog is sunsetting Bintray, which means Bintray repositories (including jcenter) are going away on May 1st, 2021 😱. What is Bintray, you ask?

That video doesn’t really help … in plain English, Bintray is a repository a company can use to host libraries for private/public use. jcenter is one of those repositories: a public Bintray repository managed by JFrog. What’s a JFrog? Well, in a small jungle in the middle of Brazil there lies an elusive frog … just kidding.

The Gradle build system relies on jcenter, among other repositories, to fetch .aar and .jar files and include them in your project’s classpath. You’ve probably seen jcenter declared in your Android projects:
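The declaration usually looks something like this (the plugin version below is just illustrative; your repo list will differ):

```groovy
// Typical root build.gradle; jcenter() appears in both repository blocks.
buildscript {
    repositories {
        google()
        jcenter() // resolves Gradle plugins from JCenter
    }
    dependencies {
        classpath "com.android.tools.build:gradle:4.1.2"
    }
}

allprojects {
    repositories {
        google()
        jcenter() // resolves library dependencies from JCenter
    }
}
```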

The problem here is that jcenter() will eventually fail to resolve any dependency. If a dependency relies solely on this repo (or any Bintray repo) and is not moved to a different repository (e.g. mavenCentral), then your build will fail to resolve that dependency and fail 💥. Even if you have no dependencies on jcenter() but still have it declared, it’s a needless search that doesn’t need to be there.

Outside of re-hosting these dependencies in your own repo, there is no fix you (the developer) can apply yourself. However, if we have an easy way to detect which artifacts need to be migrated, then we can take that information and alert the library publishers to take action. There are already some issues out in the wild that do just that.

Using Gradle to tell us where dependencies are coming from

If you were expecting a simple Gradle task (e.g. ./gradlew app:dependencies --configuration debugRuntimeClasspath -PtellMeWhatsWrong) to tell you which dependency comes from where … I have some bad news for you. I did some light digging and found that … that’s not really a thing. We could just remove jcenter() from the repositories blocks in our root build.gradle, resync our project, and wait for the failures to come in … but that wouldn’t be any fun, and I wouldn’t have a reason to write about this. If the dependencies are already cached, then removing the repos won’t have any effect until the cache expires. We’d need to nuke the Gradle caches and let Gradle refetch _all the things_, which … I don’t recommend unless you really have to (shout-out to rock3r/deep-clean for that purpose).

So why do I call this the brute-force method? Well, we’re basically going to get a list of all our dependencies and ping each repo to see which artifacts come from where … it ain’t pretty … but I wasn’t my high school homecoming king either, so here we are.

The nuts and bolts (a.k.a. hack)

BuildEnvironmentReportTask and DependencyReportTask do a great job of telling us what our dependencies are and for which configuration. If you run ./gradlew buildEnvironment you should see something like the following:
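The exact tree depends on your plugins, but the output has roughly this shape (coordinates and versions below are illustrative):

```
> Task :buildEnvironment

------------------------------------------------------------
Root project
------------------------------------------------------------

classpath
\--- com.android.tools.build:gradle:4.1.2
     +--- com.android.tools.build:builder:4.1.2
     \--- org.jetbrains.kotlin:kotlin-stdlib:1.4.21
```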

This task lists the dependencies we declared, along with each dependency’s transitive dependencies, for the classpath configuration. These dependencies come from the buildscript block where we declare all of our Gradle plugins.

The DependencyReportTask is similar in that it prints the dependencies grouped by configuration (e.g. debugRuntimeClasspath, lintClassPath, etc.) for each module in your project. This task can be run via ./gradlew dependencies and will spit out all configurations with their dependencies, or you can specify which configuration you’d like to see using --configuration.
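For example (the module and configuration names here are just the common Android defaults):

```
./gradlew dependencies
./gradlew app:dependencies --configuration debugRuntimeClasspath
```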

The idea here is to create two tasks that extend BuildEnvironmentReportTask and DependencyReportTask respectively. These two tasks can set a renderer which will spit out all the resolved dependencies for each configuration. We can use the renderer as a hook to store a list of unique dependencies and proceed from there.
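As a rough sketch of the same idea without the renderer hook (the task name collectDependencies, the configuration iteration, and the lenient-resolution try/catch are my own assumptions, not the post’s actual code):

```groovy
// Sketch only: collects unique "group:artifact:version" coordinates from every
// resolvable configuration in every module — the same set a renderer hook would see.
tasks.register("collectDependencies") {
    doLast {
        def uniqueDependencies = new TreeSet<String>()
        allprojects.each { proj ->
            proj.configurations.findAll { it.canBeResolved }.each { config ->
                try {
                    // Lenient resolution so one broken configuration doesn't abort the report.
                    config.resolvedConfiguration.lenientConfiguration.allModuleDependencies.each { dep ->
                        uniqueDependencies << "${dep.moduleGroup}:${dep.moduleName}:${dep.moduleVersion}"
                    }
                } catch (Exception ignored) {
                    // Some configurations (e.g. plain buckets) can't be resolved; skip them.
                }
            }
        }
        uniqueDependencies.each { println it }
    }
}
```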

From there we iterate through uniqueDependencies and make a network call per repo to see whether the path exists. The URL we generate points at the POM file in the version directory. This file should be there for most versions, which avoids a 404 when the version directory itself is not browsable (as happens with Google’s repo). We could also use the AAR, but the repo could host only a POM, or the artifact could be a JAR. You could also check if the path exists, then the POM, then the AAR, and lastly the JAR … but that’s a lot of network calls to fail on. Relying on the POM gets us 95% of the offending repos.
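A sketch of that check (the URL layout follows the standard Maven repository pattern of group/artifact/version/artifact-version.pom; the repo list, the HEAD request, and the helper names are my assumptions):

```groovy
// Sketch: HEAD the POM for each dependency against each repo URL and record
// the first repo that answers 200.
def pomUrl(String repoUrl, String coordinate) {
    def (group, artifact, version) = coordinate.split(':')
    // Maven layout: dots in the group become slashes; the POM lives in the version directory.
    return "${repoUrl}/${group.replace('.', '/')}/${artifact}/${version}/${artifact}-${version}.pom"
}

def repos = [ // illustrative list; use the repos your build actually declares
    "https://repo.maven.apache.org/maven2",
    "https://jcenter.bintray.com",
]

def resolvedFrom = [:] // coordinate -> first repo whose POM exists
uniqueDependencies.each { coordinate ->
    resolvedFrom[coordinate] = repos.find { repo ->
        def conn = (HttpURLConnection) new URL(pomUrl(repo, coordinate)).openConnection()
        conn.requestMethod = "HEAD"
        conn.responseCode == 200
    }
}
```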

After we are done with all of our checks, we can iterate through our map and figure out which artifacts belong to which Bintray repos.

Alternate Solutions

  • A.) Just remove the repos and let Gradle tell you what is wrong
  • B.) Nuke the Gradle cache, run a Gradle command with --info, and grep the output to see which artifact is downloaded and from where.
  • C.) Please see point A

The End

Not all repos are made the same, so our script will get wonky because of … but not limited to:

  • Version was published without a pom
  • Using exclusiveContent to declare repos: the script isn’t smart enough to take them into account and just treats them as any other repo. Honoring the filters would definitely save a bunch of unnecessary checks and speed things up a bit … but I couldn’t figure out how to access the inclusive content filter.
  • Your company repo is only accessible behind a VPN. You’ll need to connect first.
  • My poor coding

Thank you for coming to my TED talk. You can view all the code here:

Senior Software Engineer @nytimes
