Onion.City: Search the Dark Web without Tor

One software developer has built a new way to search the dark web of .onion domains without the Tor browser bundle, while offering the same security and safe haven.

The dark web: you may have heard of it as the underground internet of illicit services that requires an extra layer of anonymizing tools just to access. It's not quite that bad, but the underground network variously known as the deep web, the dark net, or the .onion domain space is officially referred to as the dark web. Often called the "hidden internet," this sub-internet offers everything from drug marketplaces to hidden wikis, all the way to hiring a hitman online.

While the dark web may be crawling with illicit services, it also has a lot to offer. Especially in heavily censored nations, the Tor browser, the anonymizing tool needed to access the dark web, can open a path past the censorship. It offers the anonymity and security to browse the internet without fear of government intrusion and censorship.

Though the dark web is fairly well known by now, it remains hidden behind Tor nodes and is hard to find. Because dark web domains don't appear on Google or other commercial search engines, it can be hard to reach the dark web in the first place, let alone navigate it once you arrive. Traditionally, you would have to scour the web for a .onion link, then boot up Tor and copy and paste the address into the browser bar.

Because the dark web is hidden, no real dark web search engines existed. Someone has changed that.

Enter Onion.City, a new way to search the dark web with a Google-like search engine, without even needing Tor to browse it.

Developed by programmer Virgil Griffith, Onion.City indexes the dark web much as Google indexes the regular web: users enter their search terms, and Onion.City searches its database and returns all the results it was able to gather.

The search engine is powered by the Tor2web proxy, a project that lets regular internet users reach .onion domains, which in turn allows the search engine to dig into .onion sites on the Tor network. The image below illustrates the concept.

Google and Onion.City

The first search is on Google; the search below the black line is on Onion.City.

As you can see, Google returns results pointing to Wikipedia's .org domain, while Onion.City returns addresses of .onion domains, served behind its own onion.city domain.

Griffith has essentially taken the Tor browser out of the picture, but the project is far from complete. At the time of writing, Onion.City has indexed only around 664,000 pages, according to a Google site:onion.city search.

Powered by the Tor2web proxy, the software acts as a middleman between the regular web and the Tor network. The only visible difference when browsing with Onion.City is that results carry a .city suffix instead of the usual .onion. The image above is a prime example: searching for a wiki leaves you with results such as xxxx.onion.city rather than xxxx.onion.

When you click a search result, you are shown the .onion domain but redirected to an onion.city subdomain, where the contents of that page are served.
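The addressing scheme described above amounts to appending the proxy's domain to the hidden-service hostname. As a rough illustration, here is a minimal Python sketch of that Tor2web-style rewrite; the hostname in the example is made up, not a real hidden service.

```python
from urllib.parse import urlsplit, urlunsplit

def to_onion_city(url: str) -> str:
    """Rewrite a .onion URL into its onion.city equivalent.

    Sketch of the rewriting scheme the article describes: the
    hidden-service hostname is kept, and the proxy's domain is
    appended, so xxxx.onion becomes xxxx.onion.city.
    """
    scheme, netloc, path, query, fragment = urlsplit(url)
    if not netloc.endswith(".onion"):
        raise ValueError("not a .onion address")
    return urlunsplit((scheme, netloc + ".city", path, query, fragment))

# Hypothetical hidden-service address, for illustration only:
print(to_onion_city("http://exampleonionaddr.onion/wiki"))
# http://exampleonionaddr.onion.city/wiki
```

The proxy then fetches the page from the Tor network on the visitor's behalf, which is why no Tor browser is needed on the client side.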

Because the dark web harbors a lot of illicit material, such as child pornography and black markets, Onion.City has blacklisted a number of domains from its search results to comply with United States law and internet regulations. Not all blacklisted domains are necessarily bad; some may simply choose not to be indexed and shown publicly. Griffith maintains a disallow page listing the sets of domains for which results will not be returned.

Griffith's search engine is definitely an innovation in dark web computing, but it is far from complete and no substitute for the Tor network. Onion.City currently runs over an insecure http:// connection, meaning web traffic may be intercepted and read. Griffith has assured concerned users that the site is working to obtain a valid SSL certificate. Many have also voiced concern over the search engine's use of Google to help index the site and the dark web, as Google is not fond of privacy and has been known to infringe on those rights. Griffith said he is aware the direct connection to Google is suboptimal, but sees the search giant's help as a "temporary-ladder just to get the ball rolling."

Concerns aside, Griffith's new project can help shed light on the not-so-dark deep web we have today.

Image courtesy of OnionCity.



  1. Maybe using Google to start with is a good thing. Hopefully it can remove the scum who create and use child pornography specifically. I’d like to venture into this and have a look, but I don’t want to stumble across this exploitation while exploring a new field. It’s these filthy animals that are ruining privacy in the first place, by giving a valid excuse for non-privacy.

  2. I’m a “programmer”? Is that a step down or up from my older title of “hacker” ?

    Regardless, glad you like OnionCity!

    1. Hi Virgil, we saw on the forum thread that you had developed the system from scratch, so we labeled you a programmer. I think both titles are suitable, and neither is a step up or down from the other.


    2. These days it’s not safe to call yourself a hacker, since the legal system doesn’t understand that hacking is not all bad. I can hear the federal attorney’s cross-examination of you: “Isn’t it true, Mr. Griffith, that you call yourself a hacker?” No matter how you explain it, the judge and jury won’t understand; they just know hackers are bad people who steal data and money.