Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - astor

Pages: 1 ... 51 52 [53] 54 55 ... 208
781
Off topic / Re: Hey, come chat with us!
« on: July 08, 2013, 07:33 am »
Ok, both of you are on Tails. The problem is easy to fix. Change the proxy settings to the "Gnome default" or however it's worded. Don't use the Tor/Privacy option.

Accounts -> Manage Accounts -> <chat service name> -> Modify -> Proxy tab
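If the GNOME-default option isn't there for some reason, setting the proxy manually on that same tab should also work. Roughly this (I'm assuming the standard Tails setup, where the local Tor SOCKS port is 9050):

Proxy type: SOCKS 5
Host: 127.0.0.1
Port: 9050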

782
Off topic / Re: Hey, come chat with us!
« on: July 08, 2013, 06:48 am »
Quote
Um, I must be retarded or something. Can't get it to work :(

Keep getting this message: Unable to connect: Resource temporarily unavailable

Suggestions?

I need some more info.

Open Help -> Debug Window. When I connect to the silcroad server, I see these lines (with some omitted):

account: Connecting to account astor@silcroadg3c3mtu6.onion.

dnsquery: Performing DNS lookup for 127.0.0.1
dnsquery: IP resolved for 127.0.0.1

proxy: Attempting connection to 127.0.0.1
proxy: Connecting to silcroadg3c3mtu6.onion:6667 via 127.0.0.1:9150 using SOCKS5

socks5 proxy: Connection in progress
socks5 proxy: Connected.
socks5 proxy: Able to read.

proxy: Connected to silcroadg3c3mtu6.onion:6667


You should see a similar sequence of events. At which step does it fail?

783
Security / Re: What email website to use?
« on: July 07, 2013, 10:15 pm »
Another hidden service based email provider is MailTor: bdom5vcb53z5hqz5.onion

Clearnet providers that allow registration over Tor: outlook.com, safe-mail.net

There may be others.

784
Security / Re: Hosting a server for hidden services
« on: July 07, 2013, 09:24 pm »
Some other points.

If you're really worried about the hidden service being identified, you should NOT run it on a VPS. Get a dedicated server. You can find low end dedis for under $50 a month these days, which should be enough to get started. Make sure you remove any backdoors, like ssh keys that the provider puts in root's authorized_keys file.
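For example, something along these lines will show you what's already authorized (a rough sketch, the exact paths depend on the image your provider installed):

# list any SSH keys the provider pre-installed for root or other users
cat /root/.ssh/authorized_keys
cat /home/*/.ssh/authorized_keys

Delete anything in there that you didn't add yourself.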

Use full disk encryption. You will have to request this feature if the provider doesn't have a way for you to load installation images and install the OS yourself.

Never connect to the hidden service directly. Create a separate hidden service for ssh and enable the HiddenServiceAuthorizeClient option in stealth mode.
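For reference, a minimal torrc sketch of that setup -- the directory path and client name are placeholders, not anything special:

# on the server
HiddenServiceDir /var/lib/tor/ssh_hidden_service/
HiddenServicePort 22 127.0.0.1:22
HiddenServiceAuthorizeClient stealth myclient

# on your own machine, add the line Tor writes to the server's
# hostname file (onion address plus auth cookie)
HidServAuth <your-ssh-onion-address>.onion <auth-cookie>

With stealth auth, clients without the cookie can't even fetch the hidden service descriptor, let alone connect.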

Rent the server anonymously, whether or not it's disposable. The vast majority of providers don't take bitcoins, so you'll have to anonymize fiat currency. I haven't done that for years, and the most popular method back then, Liberty Reserve, is gone, so you'll have to find other ways of bouncing your money through exchanges. Other people are more knowledgeable about that than I am and can probably advise you in private.

And if all this is too much, then like Jack said, just get an FH invite. :)

785
Security / Re: Hosting a server for hidden services
« on: July 07, 2013, 09:10 pm »
Quote
I know configuring applications to route through Tor can lead to issues like DNS leaks, but binding the server to localhost should prevent any data from being sent outside of Tor from the server.

You'd think so, but many things could go wrong to leak your IP. Besides server misconfiguration and hacking / rooting the server, if you are running PHP or other scripting languages, malicious scripts could connect over clearnet to an attacker's server to reveal your IP. The safest configuration is this:

http://dkn255hz262ypmii.onion/index.php?topic=100998.0

But that's too expensive for the vast majority of hidden services. A better alternative is to use VMs or jails to isolate the web server and the Tor client, and route everything from the web server VM through the Tor VM, for example, so even if the server is pwned, the attacker won't find your IP (at least not easily).

Another option is to rent the server anonymously and make frequent backups. This is the "disposable server" option, where you simply drop the server and redeploy elsewhere if it is pwned. :)

Quote
I was planning on going with Apache running on linux for the server unless anyone can recommend a better alternative. I don't have much experience with server configuration yet. Insights and advice would be appreciated, thanks.

Generally, the simpler the application (in this case the web server software), the smaller the attack surface. Apache is a complex web server with a large attack surface. If you can avoid running scripts entirely, then a simple server that only serves static HTML would be the safest, but Nginx with minimal features would still be better than Apache.
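To give you an idea, here's a bare-bones sketch of a static-only Nginx server block bound to loopback, plus the matching torrc lines. Paths and the local port are placeholders:

# nginx: listen on localhost only, serve static files, nothing executes
server {
    listen 127.0.0.1:8080;
    root /var/www/hidden_service;
    index index.html;
    location / {
        try_files $uri $uri/ =404;
    }
}

# torrc: expose it as port 80 on the onion address
HiddenServiceDir /var/lib/tor/hidden_service/
HiddenServicePort 80 127.0.0.1:8080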

786
Silk Road discussion / Re: New vendor bonds not hedged?
« on: July 07, 2013, 07:46 pm »
That's correct, it isn't hedged.

787
Newbie discussion / Re: Newbie Basic Essentials to Silk Road.
« on: July 07, 2013, 06:33 pm »
There used to be a lot more stickies in each of the subforums. In fact, at one time, a majority of the threads on the first page of Security and Off Topic were stickies, but almost all of them were removed under a new policy that only threads created by mods/admins can be stickied -- except for a few in Shipping, I guess.

788
Quote
I have a couple of other layers of protection in place so I normally wouldn't be concerned either way.  But my roommate has recently discovered TOR and doesn't seem to be very aware about such things as internet security....  uses the same computer for everything, doesn't know what a VPN is, etc..  and my only concern with this is because we share the same Internet connection and I'm not sure if the fact that we share an ISP could cause any security problems for me or anyone else who happens to be in the same relay...  due to his lax attitude toward securing his own computer.

Only if his internet activities cause the ISP to watch your connection or local LE to raid you. Otherwise, what he does on his computer has no effect on your computer, or your security, or your Tor circuits.

Quote
In trying to think it out, it would seem as if one has been on the SR for example, and maybe a couple other hidden services, that if after logging out of any hidden service(s) you've been to and just prior to stopping TOR and exiting the network, you switch identities.  Would that not then put you on a totally different set of relays and make anywhere that you visited completely untraceable even if the exit node logged your IP?  Wouldn't in the case of a malicious exit node attempting to track your activities, switching to a new identity make it appear as if you had been on TOR, went nowhere, then exited?

There is a theoretical attack where an adversary can determine which sites you are visiting by running a large number of relays, then looking for combinations of simultaneous destroy cells in the circuits passing through them. Your Tor client sends destroy cells when you shut it down. So let's say you build three circuits and are currently using one of them to visit web site A. You are unlucky enough to have picked the adversary's nodes once in each of your circuits. kmf has described it a few times on the forum. This is how it looks:

Circuit1 -> entry -> middle -> bad exit -> site A
Circuit2 -> entry -> bad middle -> exit
Circuit3 -> bad entry -> middle -> exit

You shut down your Tor client and send destroy cells to all the relays in those three circuits. The adversary notices that his relays (the "bad" ones above) get destroy cells at about the same time, so he concludes they came from the same client. Based on Circuit3, he knows who you are (your IP address). Circuit2 simply provides more confirmation in this case. Circuit1 tells him which site you are visiting, and that is how you get pwned, because he knows who you are and what you are doing.

If this attack could work at all, considering the large number of Tor clients and the continuous circuit destruction they see, it would mostly work on people visiting clearnet sites. To identify which hidden service you are visiting, the attacker would have to be both one of your entry guards and one of the hidden service's entry guards, which is much less likely than him being one of your entry guards and the exit node of a clearnet circuit.

The simple solution to this attack is to do exactly what you described: create a new identity, thus creating new circuits, before you shut down your Tor client. I think it is highly unlikely to happen and probably not worth doing, but that is a matter of academic debate.
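If you'd rather script it than click Vidalia's New Identity button, the same thing can be done over the control port. This assumes you've enabled ControlPort 9051 and password authentication in your torrc, which is not the default:

AUTHENTICATE "your-control-password"
SIGNAL NEWNYM
QUIT

NEWNYM tells Tor to use clean circuits for new connections, which is what the New Identity option does under the hood.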

789
Security / Re: PGP encryption and talking to vendors HELP
« on: July 07, 2013, 04:02 pm »
Ok, then it's worse than I thought.

Here's what's wrong with this situation.

1. Out of escrow transactions are against SR rules, and if you are caught (which you probably will be at this point), you and the vendor will be banned from SR.

2. Using money orders could get you and the vendor identified, at least more easily than using bitcoins and making transactions on SR. If he asks one of his customers to send a money order to a PO box and that customer turns out to be LE, they will probably be able to identify him. In order to get himself a reduced sentence, he could turn over info about you. It's much less safe than using bitcoins for any kind of transaction, and especially than going through SR, where the internal transactions never appear in the block chain.

3. There is no resolution center when you do this. If you send someone a money order and don't get your package, what are you going to do? You're shit out of luck.

790
Security / Re: PGP encryption and talking to vendors HELP
« on: July 07, 2013, 03:42 pm »
What do you mean by transactions outside of SR? They want you to send payment to a separate bitcoin address, not through the SR escrow system?

791
Security / Tor and state surveillance
« on: July 07, 2013, 01:20 pm »
There's an interesting thread on the tor-talk mailing list, which I thought I'd repost here.

user:

Dear reader,

I'm a Tor user.

My interest in anonymity awoke in response to the European
parliament passing the data retention directive in 2005.  I did (and
still do) not want my ISP to be able to spy on everything I do.
I maintain a German web site explaining how Internet communication
works, warning against data retention, and advertising anonymity via
Tor [1].  I thought that there is not much to lose when using Tor
(except for speed).

Now, I'm about to include a big warning concerning Tor.  Maybe I'm
driven by fear, uncertainty, and doubt.  But I doubt that.  I'd like
to see this e-mail as a consensus check ;)

I'm only talking about Tor users like me, living in a stable
democracy.  In my idealistic (or naive?) view, it's nobody's
business to collect data about me as long as I'm not a suspect of
crime.  If they do anyways, they violate my (perceived) rights,
privacy, and dignity.  I'm using Tor as tool to fight that
violation.  (My reasoning does not apply to people under oppressive
regimes who use Tor as protection from their own government when
they coordinate and communicate and whose physical freedom and
well-being are at risk.)

Of course, since Tor's beginning the threat model has been excluding
global passive adversaries (which are able to observe both ends of
the torified communication) but I didn't consider that a real issue.
However, now I do.

Today, the GCHQ (GB) is running Tempora to spy on all transatlantic
data, including three days of full storage for deeper analysis.  The
NSA (US) is doing all kinds of spying with PRISM, including rumors
of tapping directly into the German Internet eXchange DE-CIX [2].
The DGSE (French foreign intelligence agency) is spying massively on
the French (so much for *foreign* intelligence).  The BND (German
foreign intelligence) is allowed to monitor up to 20% of
border-crossing Internet traffic; supposedly, they are looking at 5%
right now and investing heavily to increase that number [3].

In 2007 Murdoch and Zieliński [4] developed traffic analysis
techniques based on sampled data for parties monitoring Internet
eXchanges (IXes).  Apparently, the parties mentioned above have
capabilities that go far beyond the paper's sampling technique.
Thus, I'm assuming that global adversaries are spying on me.

As I said, initially I worried about my ISP under data retention and
considered Tor to be an excellent protection.  Of course, that's
only part of the story as I'd like to restrict who is able to spy on
me as much as possible, whether my ISP, the ordinary criminal, or
our governments's spies.  Frankly, I only started to think about the
last point after seeing the video "Enemies of the State" of last
year's Chaos Communication Congress [5].  There, former NSA
officials complained that the NSA is beating US citizens'
constitutional rights into the dust.  However, the existence of
rights for Non-Americans was not acknowledged, and I wondered how my
expectations should look like given that I'm not protected by the US
constitution.

Now, Tor re-routes traffic on a world-wide basis.  I believe that
without special precautions (I'm going to write a separate e-mail on
that), my communication with the entry node as well as the exit's
with the real communication partner will flow through big pipes and
IXes, which are worth the investment of spying facilities; of
course, terrorism needs to be fought...

Thus, Tor does not anonymize; instead, it turns all my network
traffic over to adversaries.  Hopefully, Tor makes the adversaries'
lives harder, and they need more compute power to spy on me.  Maybe
they find torified traffic more interesting and handle it with
higher priority.  In any case, I assume that torified traffic gets
analyzed.

In contrast, without Tor I'm *not* certain that all my traffic gets
analyzed.  Part of my traffic does not need to flow through big
pipes and IXes but stays in local, untapped regions of the Internet.

Thus, my warning could read as follows:

1. If you are using Tor, you should assume that all your network
traffic gets stored, analyzed, and de-anonymized by intelligence
agencies.

2. If you do not use Tor, you should be aware that your ISP could
spy on all of your network traffic, while part of it (that part
passing tapped IXes) gets stored and analyzed by intelligence
agencies.

Of course, there still is more fun in using Tor.

What's your take on the current situation?  Should the Tor FAQ
include a similar warning?

=========

arma:


Quote
1. If you are using Tor, you should assume that all your network
traffic gets stored, analyzed, and de-anonymized by intelligence
agencies.

I don't want to tell you to stop worrying, but depending on how much
you think these intelligence agencies collaborate, I think the "and
de-anonymized" phrase might be overstated. For example, I would not be
surprised if French intelligence doesn't have enough of a reach on the
Internet to be able to break Tor easily -- simply because they haven't
made enough deals with enough backbone providers relative to the locations
of big Tor relays. Maybe they trade data with England and the US, but
then again maybe they don't (or don't trade all of it).

One of the unfortunate properties of the Internet is how it's much less
decentralized than we'd like (and than we used to think). But there are
still quite a few different places that you need to tap in order to have
a good chance of beating a Tor circuit. For background, you might like:

http://freehaven.net/anonbib/#feamster:wpes2004
http://freehaven.net/anonbib/#DBLP:conf:ccs:EdmanS09

and there's a third paper in this chain of research which I'm hoping
the authors will make public soon -- stay tuned.

Quote
2. If you do not use Tor, you should be aware that your ISP could
spy on all of your network traffic, while part of it (that part
passing tapped IXes) gets stored and analyzed by intelligence
agencies.

I think you're underestimating the problem here. You say "Part of my
traffic does not need to flow through big pipes and IXes but stays in
local, untapped regions of the Internet." I think for the typical web
user, basically _every single page they visit_ pulls in a component that
goes through these 'big pipes' you refer to.

In short, I think web users are in bad shape using Tor if their adversary
is "every intelligence agency combined", but they're in way way worse
shape when not using Tor.

While I'm at it -- you don't think Deutsche Telekom has a deal with
BND where they hand over all the internal German Internet traffic they
see? I hope the era where people say "My government is doing everything
that has been reported in the news so far, but surely they're not doing
anything else" is finally over, but I guess it will be a while yet.

=========

mp:

It's also important to understand the limitations of these attacks. If
the data they record is low resolution (such as Murdoch's IX sampled
results), the accuracy will be poor.

Murdoch didn't achieve any success at all until several megabytes were
transmitted in a single connection, and even after that, the accuracy
was heavily impacted by the prevalence of similar traffic elsewhere in
the network (due to a phenomenon called the 'base rate fallacy').

The more people use Tor, the better this property gets. In fact, a
Raccoon (when you run an anonymity network, you get all sorts of
interesting characters) proved that the accuracy of dragnet correlation
attacks falls proportional to 1/U^2, where U is the number of concurrent
active users. This creature also pointed out the same property is
visible in Murdoch's own graphs:

http://archives.seul.org/or/dev/Sep-2008/msg00016.html
https://lists.torproject.org/pipermail/tor-talk/2012-March/023592.html

I think this property suggests that with better usability and some
lightweight defenses, Tor can actually do quite well, especially for
relatively small, short transmissions like website loads.

I am worried about the level and duration of timing resolution that
datacenters as large as the NSA one in Utah could provide (assuming that
all that storage is for traffic, and not for stuff like mapping ECC
curves onto Z_p). Even so, I still think protocol-level active attacks
(such as RPW's hidden service Guard discovery attack, and the Raccoon's
bitstomping/tagging attack) are far more likely to be how intelligence
agencies and others will attack Tor:

http://www.ieee-security.org/TC/SP2013/papers/4977a080.pdf
https://lists.torproject.org/pipermail/tor-dev/2012-March/003347.html

792
Quote
Yes they have, prices of cocaine have definitely gone up.....0.8 bitcoins for half a gram is ridiculous!! but that's what it is!

Maybe the bitcoin price has gone up because the listing was pegged to the dollar value.

793
Security / Re: my laptop
« on: July 07, 2013, 09:09 am »
Your login details will be on your hard drive if you personally saved them there. Tor Browser doesn't save your login details to disk. The "Remember passwords for sites" option is even disabled in Preferences -> Security.
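If you want to verify it yourself, the Firefox pref behind that checkbox should be signon.rememberSignedOnPasswords, and in Tor Browser it shows up as false in about:config, i.e. the equivalent of this prefs.js line:

user_pref("signon.rememberSignedOnPasswords", false);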

794
Security / Re: accessing SR without accepeting cookie?
« on: July 07, 2013, 09:05 am »
Quote
My firefox is now set using the above settings, and the security tests look better. However, I recently had the browser uniqueness test on (CLEARNET) https://panopticlick.eff.org/ and the results were not impressive. As you said, because of my settings my browser seems quite unique. So, I read their work on browser fingerprints:

When I look at the individual items in the Panopticlick results, the parameter with the highest entropy for me is browser window size. That (and some other things) can be eliminated by turning off JavaScript. Doing that reduces your anonymity set to those users with JavaScript disabled, but it still makes you less identifiable than giving web sites a unique browser window size. By disabling JavaScript in Tor Browser, I look like 1 in 979 browsers, which is pretty good.
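If anyone wants to do the same, the two easy ways are NoScript's option to forbid scripts globally, or flipping the master pref in about:config, which amounts to this line in prefs.js:

user_pref("javascript.enabled", false);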

Quote
I'm using a Firefox browser on bootable usb stick with Ubuntu to avoid leaking information and not leave traces on my machine.

You're using the regular Firefox? You really should be using Tor Browser and not a regular web browser. Read about all the things Tor Browser protects you against that regular Firefox doesn't:

https://www.torproject.org/projects/torbrowser/design/


Here's a list of patches that make Tor Browser safer than Firefox:

Block Components.interfaces -- In order to reduce fingerprinting, we block access to this interface from content script. Components.interfaces can be used for fingerprinting the platform, OS, and Firefox version, but not much else.

Make Permissions Manager memory only -- This patch exposes a pref 'permissions.memory_only' that properly isolates the permissions manager to memory, which is responsible for all user specified site permissions, as well as stored HSTS policy from visited sites. The pref does successfully clear the permissions manager memory if toggled. It does not need to be set in prefs.js, and can be handled by Torbutton.

Make Intermediate Cert Store memory-only -- The intermediate certificate store records the intermediate SSL certificates the browser has seen to date. Because these intermediate certificates are used by a limited number of domains (and in some cases, only a single domain), the intermediate certificate store can serve as a low-resolution record of browsing history. As an additional design goal, we would like to later alter this patch to allow this information to be cleared from memory. The implementation does not currently allow this.

Add a string-based cacheKey property for domain isolation -- To increase the security of cache isolation and to solve strange and unknown conflicts with OCSP, we had to patch Firefox to provide a cacheDomain cache attribute. We use the url bar FQDN as input to this field.

Block all plugins except flash -- We cannot use the @mozilla.org/extensions/blocklist;1 service, because we actually want to stop plugins from ever entering the browser's process space and/or executing code (for example, AV plugins that collect statistics/analyze URLs, magical toolbars that phone home or "help" the user, Skype buttons that ruin our day, and censorship filters). Hence we rolled our own.

Make content-prefs service memory only -- This patch prevents random URLs from being inserted into content-prefs.sqlite in the profile directory as content prefs change (includes site-zoom and perhaps other site prefs?).

Make Tor Browser exit when not launched from Vidalia -- It turns out that on Windows 7 and later systems, the Taskbar attempts to automatically learn the most frequent apps used by the user, and it recognizes Tor Browser as a separate app from Vidalia. This can cause users to try to launch Tor Browser without Vidalia or a Tor instance running. Worse, the Tor Browser will automatically find their default Firefox profile, and properly connect directly without using Tor. This patch is a simple hack to cause Tor Browser to immediately exit in this case.

Disable SSL Session ID tracking -- This patch is a simple 1-line hack to prevent SSL connections from caching (and then later transmitting) their Session IDs. There was no preference to govern this behavior, so we had to hack it by altering the SSL new connection defaults.

Provide an observer event to close persistent connections -- This patch creates an observer event in the HTTP connection manager to close all keep-alive connections that still happen to be open. This event is emitted by the New Identity button.

Limit Device and System Specific Media Queries -- CSS Media Queries have a fingerprinting capability approaching that of Javascript. This patch causes such Media Queries to evaluate as if the device resolution was equal to the content window resolution.

Limit the number of fonts per document -- Font availability can be queried by CSS and Javascript and is a fingerprinting vector. This patch limits the number of times CSS and Javascript can cause font-family rules to evaluate. Remote @font-face fonts are exempt from the limits imposed by this patch, and remote fonts are given priority over local fonts whenever both appear in the same font-family rule. We do this by explicitly altering the nsRuleNode rule representation itself to remove the local font families before the rule hits the font renderer.

Rebrand Firefox to Tor Browser -- This patch updates our branding in compliance with Mozilla's trademark policy.

Make Download Manager Memory Only -- This patch prevents disk leaks from the download manager. The original behavior is to write the download history to disk and then delete it, even if you disable download history from your Firefox preferences.

Add DDG and StartPage to Omnibox -- This patch adds DuckDuckGo and StartPage to the Search Box, and sets our default search engine to StartPage. We deployed this patch due to excessive Captchas and complete 403 bans from Google.

Make nsICacheService.EvictEntries() Synchronous -- This patch eliminates a race condition with "New Identity". Without it, cache-based Evercookies survive for up to a minute after clearing the cache on some platforms.

Prevent WebSockets DNS Leak -- This patch prevents a DNS leak when using WebSockets. It also prevents other similar types of DNS leaks.

Randomize HTTP pipeline order and depth -- As an experimental defense against Website Traffic Fingerprinting, we patch the standard HTTP pipelining code to randomize the number of requests in a pipeline, as well as their order.

Emit an observer event to filter the Drag and Drop URL list -- This patch allows us to block external Drag and Drop events from Torbutton. We need to block Drag and Drop because Mac OS and Ubuntu both immediately load any URLs they find in your drag buffer before you even drop them (without using your browser's proxy settings, of course). This can lead to proxy bypass during user activity that is as basic as holding down the mouse button for slightly too long while clicking on an image link.

Add mozIThirdPartyUtil.getFirstPartyURI() -- API This patch provides an API that allows us to more easily isolate identifiers to the URL bar domain.

Add canvas image extraction prompt -- This patch prompts the user before returning canvas image data. Canvas image data can be used to create an extremely stable, high-entropy fingerprint based on the unique rendering behavior of video cards, OpenGL behavior, system fonts, and supporting library versions.

Return client window coordinates for mouse events -- This patch causes mouse events to return coordinates relative to the content window instead of the desktop.

Do not expose physical screen info to window.screen -- This patch causes window.screen to return the display resolution size of the content window instead of the desktop resolution size.

Do not expose system colors to CSS or canvas -- This patch prevents CSS and Javascript from discovering your desktop color scheme and/or theme.

Isolate the Image Cache per url bar domain -- This patch prevents cached images from being used to store third party tracking identifiers.

nsIHTTPChannel.redirectTo() API -- This patch provides HTTPS-Everywhere with an API to perform redirections more securely and without addon conflicts.

Isolate DOM Storage to first party URI -- This patch prevents DOM Storage from being used to store third party tracking identifiers.

Remove "This plugin is disabled" barrier -- This patch removes a barrier that was informing users that plugins were disabled and providing them with a link to enable them. We felt this was poor user experience, especially since the barrier was displayed even for sites with dual Flash+HTML5 video players, such as YouTube.

795
Quote
On another note, would it really change anything by using PGP or another type of encryption if it were in fact a vendor selling, posting or whatever? Once it is decrypted for shipping, voila, they have it now in plain text to be transferred. Unless I am missing something. Again, it's only if it's a vendor that I don't see the protection.

Nope, PGP doesn't protect you from a rogue vendor. If you take the proper security precautions (buying bitcoins anonymously, encrypting your address), there are two unavoidable points where you must trust other people: you trust that the SR admins won't steal your money, and you trust that the vendor will handle your shipping info carefully.
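To be clear about the "encrypting your address" part: that just means PGP-encrypting it to the vendor's public key before pasting it into the order, for example with GnuPG on the command line (the file names are only examples):

gpg --import vendor_pubkey.asc
gpg --armor --encrypt --recipient "VendorName" address.txt

The output, address.txt.asc, is what you paste into the order form; only the vendor's private key can decrypt it.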
