

Messages - kmfkewm

1021
Security / Re: Brainstorming the ideal anonymity network
« on: June 09, 2013, 03:58 am »
5. How should we attain our goals?

A. Untraceability

This means that an attacker cannot tell where traffic they observed originated from. Pretty much all anonymity networks put some focus on untraceability because it is required for essentially all other properties that anonymity networks strive for.

B. Unlinkability

This means that an attacker cannot tie multiple actions to the same user.

C. Deniability

This means that an attacker cannot prove beyond a reasonable doubt that a user originally published, or intentionally accessed, certain published information. The attacker may know that a certain user transmitted certain information, but they cannot show that the user originated it or knowingly published it. Likewise, they may know that a certain user requested certain information, but they cannot determine whether the user intentionally accessed it. This strategy is used heavily by Freenet, and to a lesser extent by I2P. Tor is the only one of the three networks that puts no focus at all on deniability.

D. Membership Concealment

This means that an attacker cannot determine who is using the network. Tor and Freenet both put an emphasis on membership concealment; these days Tor focuses on it very strongly, with the advent of its steganographic bridge links. I2P, on the other hand, has essentially zero membership concealment: the entire user list of the network is an open book.

E. Censorship resistance / blocking resistance

This means that an attacker cannot prevent users from accessing the network, and also cannot prevent publishers from publishing content. Tor puts a large amount of effort into preventing attackers from blocking users from accessing the network, but it is currently quite weak against attackers censoring content published through the network. I2P puts essentially no effort into preventing users from being blocked from the network, but it does make it hard to block access to services hosted through the network (due to multihomed service support). Freenet puts effort into preventing blocking and also does a spectacular job of preventing content from being censored: it is extremely difficult to censor content on Freenet.

6. What are some attacks that we know we need to give consideration to?

A. Timing attacks

A timing attack is when an attacker links packets together through statistical analysis of their arrival times at multiple locations. There are two known ways to counter timing attacks: mixing can offer very strong defenses against internal and external timing attacks, while plausible deniability via variable-length tunnels and forced routing can prevent internal (though probably not external) timing attacks from reaching any certain conclusion.

B. Byte counting attacks

A byte counting attack follows a flow through the network by counting how many bytes it consists of. The only way to protect against a byte counting attack is padding. If all flows are padded to the same size, as is the case with modern remailer networks, then byte counting attacks are impossible. If flows are merely rounded up to a fixed cell size, byte counting attacks become less reliable; this is the case in Tor (where all traffic flows are rounded up to the next multiple of 512 bytes) and I2P (where all traffic flows are rounded up to the next multiple of 1 KB). Note that there are two sorts of byte counting attack: counting the bytes of individual packets (easily prevented by padding all packets to the same size) and counting the bytes of entire traffic flows (hard to prevent unless all flows are padded to the same size, though any amount of padding reduces the attack's accuracy).
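The round-up-to-cell-size idea can be sketched as follows. This is a toy function, not any network's actual wire format (a real format would also encode the true payload length so the padding can be stripped at the other end):

```python
def pad_to_cell(payload: bytes, cell: int = 512) -> bytes:
    """Round a payload up to the next multiple of the cell size, so an
    observer counting bytes only learns a coarse bucket, not the exact
    length."""
    cells = max(1, -(-len(payload) // cell))  # ceiling division, min one cell
    return payload.ljust(cells * cell, b"\x00")

print(len(pad_to_cell(b"lol")))      # 512: a tiny message fills one cell
print(len(pad_to_cell(b"x" * 500)))  # 512: indistinguishable from the above
print(len(pad_to_cell(b"x" * 513)))  # 1024: spills into a second cell
```

Messages falling in the same bucket are indistinguishable by size, which is why coarser buckets (up to padding every flow to one fixed size) give stronger protection.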

C. Watermarking attacks / tagging attacks

These are less sophisticated than timing attacks but work in a similar fashion. A watermarking attack is when the attacker modifies a traffic stream to make it identifiable at a later point. One way of accomplishing this is by delaying individual packets to embed a detectable interpacket arrival fingerprint in the flow. Time-delayed mixing can protect against watermarking attacks, because the mix gathers all packets prior to forwarding them on, which destroys the embedded watermark. Networks like Tor and I2P are weak against watermarking attacks because their relay nodes forward packets as they arrive, so the interpacket arrival characteristics remain consistent once modified.

D. Intersection attacks

Intersection attacks work by identifying multiple crowds that the target must belong to, then removing every node that does not appear in all of those crowds. For example, if you can enumerate all of the nodes on a network during the time that the target sends communications to you, you can determine that the target is one of the nodes currently online. After doing this many times, you can shrink the suspect list, thanks to natural node churn. Intersection attacks have a variety of different manifestations.
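A minimal sketch of the intersection itself (the names and snapshots are invented): each observation is the set of users online when the target acted, and intersecting the snapshots shrinks the suspect pool as churn removes non-targets:

```python
def intersect_suspects(observations):
    """Each observation is the set of users online when the target acted.
    The target must appear in every one, so intersecting them shrinks
    the suspect pool as natural churn removes everyone else."""
    suspects = set(observations[0])
    for online in observations[1:]:
        suspects &= set(online)
    return suspects

# Three snapshots of who was online when the target sent a message:
snapshots = [
    {"alice", "bob", "carol", "dave"},
    {"alice", "bob", "erin"},
    {"alice", "carol", "bob"},
]
print(sorted(intersect_suspects(snapshots)))  # ['alice', 'bob']
```

This is why a trivially enumerable user list (as on I2P) is dangerous: each enumeration is one more set to intersect.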

E. Traffic identification attacks

Traffic identification is the most trivial attack to protect against. If you send traffic through a series of nodes without layer encrypting it, a node can identify traffic it previously routed simply by looking for the same traffic at a later point. I2P and Tor protect against this attack with layers of encryption. Freenet does *not* protect against internal traffic identification attacks (only external ones), but it doesn't really need to because it relies so heavily on its strong plausible deniability techniques.

F. All of the known mix attacks, like flushing etc

I already covered these earlier.



Anyway, I am a bit tired of typing and I cannot possibly summarize all of the things we would need to take into consideration anyway, so I will wrap this up with some suggestions.

First of all I think that low latency networks are already covered with Tor and I2P. It is not likely that we are going to be able to make any significant advances to the state of low latency anonymity, and if we were going to it would be by convincing the Tor developers to make some tweaks, not by designing a brand new network. I think that high latency networks are too slow to attract many users, and although they technically can be used for browsing the internet etc, they are too slow to do so. So I think that a variable latency network is the best bet. There is some research already done on this in the context of mix networks, it is called Alpha mixing or Tau mixing. As far as using a mix network goes, this is a bit of a tough call. On the one hand I think mixing is by far the most researched and proven way of providing strong anonymity, on the other hand I would really like to have a P2P anonymity network like I2P, and I would worry that a very large network would dilute the concentration of messages to mix together. Perhaps this can be slightly ameliorated by the utilization of dummy traffic, which would be more realistic on a P2P network with lots of bandwidth.

I definitely think that any new network should support access to the clearnet. Networks that are only for hidden services simply do not attract as many people as networks that can be used for surfing the regular internet. Additionally, allowing access to the clearnet essentially guarantees a large pool of cover traffic for mixing, and that translates into more anonymity with less time delay. On the other hand, I think I prefer the Freenet strategy of hosting content distributed throughout the network. I think this will encourage more people to actually run hidden services, as they will not need to learn how to configure a server and, more importantly, they won't need to buy a server in the first place. The primary disadvantage is that we will need to create use-case-specific applications, such as a software package for forums, one for email, one for blogging, etc. If Tor hidden services have shown us anything, it is that people who want to run hidden service servers don't have the technical expertise required to do so securely. I also like how resistant Freenet hidden services are to DDoS and similar censoring attacks.

I think that deniability is an important property that we should definitely utilize. Mixing traffic can protect against timing attacks being carried out; deniability techniques can prevent timing attacks from proving anything after they are carried out. We would primarily be focusing on a medium latency user base: people who want to access sites fast enough to surf the internet, but who require enough anonymity that they can wait a minute or two. By having variable time delays, traffic of all latencies is given an anonymity advantage, even traffic without any delay at all. This means that just by having some people use the network in a high latency fashion, the average users operating at medium latency gain increased anonymity. Having time delays at all will give us some protection from timing attacks; ideally you would have multi-hour delays to protect against them, but even 0 seconds to 1 minute per hop should make the network more resistant to timing attacks than Tor or I2P are. Having variable length paths and having all clients route by default will provide plausible deniability as well. All of these things in combination should offer significant protection from timing attacks.

Another thing we need to consider is our padding strategy. It is very easy to pad all packets to the same size, and of course we should do this. However, it would also be ideal if all traffic flows consisted of a single packet. The more padding that is used, the more likely it is that an arbitrary webpage can be loaded with a single fixed-size packet (i.e., if all packets are 1 MB, then all webpages of 1 MB and below can be loaded with a single packet). On the other hand, larger packet sizes lead to inefficient, unscalable networks (i.e., if all packets are 1 MB, then you just spent 1 MB times the number of routing nodes utilized to send your three-byte "lol" message). Perhaps swarming can be used to help hide the size of large messages, or something sort of like I2P's unidirectional tunnels (except it would be more like hydra tunnels).
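To make the trade-off concrete, here is a toy overhead calculation, under the simplifying assumption that every packet is padded to a fixed size and relayed whole through each hop (the hop count and sizes are illustrative only):

```python
def padding_overhead(msg_len: int, packet_size: int, hops: int) -> int:
    """Total bytes moved across the whole network to deliver one message
    when every packet is padded to `packet_size` and relayed through
    `hops` routing nodes."""
    packets = max(1, -(-msg_len // packet_size))  # ceiling division
    return packets * packet_size * hops

MB = 1024 * 1024
# A three-byte "lol" with 1 MB packets over 5 hops moves 5 MB in total:
print(padding_overhead(3, MB, 5))    # 5242880
# The same message with 512-byte cells over 5 hops moves only 2560 bytes:
print(padding_overhead(3, 512, 5))   # 2560
```

Larger packets hide more page loads inside a single fixed-size unit, but the per-message bandwidth cost grows with both packet size and path length.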

I am a big fan of layer encrypted networks, and of course for mixing to be utilized layer encryption has to be utilized as well.

Another possibility is using PIR somewhere. The bleeding edge theoretical mix networks use PIR rather than SURB mixing for message retrieval.

1022
Security / Brainstorming the ideal anonymity network
« on: June 09, 2013, 03:58 am »
Right now there are three anonymity networks that are given any respect by the academic community. Of course Tor is the favorite of the academic world, but I2P and Freenet have both also had some professional analysis done on them. If we count remailer networks then we can also include Mixmaster and Mixminion.

Tor was primarily designed for accessing the clearnet anonymously. It separates routing nodes and clients, meaning that being a client and being a relay are not mutually inclusive. Tor primarily derives its anonymity by routing layer encrypted client communications through three different nodes before they make it to the destination server. For the most part, attackers cannot deanonymize a Tor user unless they can watch the user's traffic enter the network and exit the network. By having a very large network of volunteer routing nodes, Tor manages to significantly reduce the probability that any attacker will happen to control the entry and exit nodes selected by the client. Tor also pads all packets to the same size, 512 bytes.

I2P was designed almost exclusively for accessing Eepsites anonymously ('Eepsite' is I2P jargon for a hidden service). In almost all cases, I2P clients are also I2P relays. I2P derives its anonymity in much the same way as Tor: outgoing communications are layer encrypted and routed through a series of user selected relays prior to reaching their destination. I2P has some key differences from Tor though. For one, Tor uses bidirectional tunnels, meaning that forward and return traffic share the same circuit, whereas I2P uses unidirectional tunnels: forward traffic and return traffic are routed through different tunnels. There is an argument that this helps protect somewhat against internal website fingerprinting, but I doubt it does much. I2P developers also argue that it helps protect against internal timing attacks, but I believe it most likely increases the vulnerability to such attacks. One thing that possibly protects I2P users from internal timing attacks is the fact that virtually all nodes are relays, and tunnels by default can be variable length. This means that an internal attacker watching traffic originate at me and end at the destination may not be able to tell that the traffic really originated at me; it is possible that I forwarded it on for someone else. Tor almost always uses three nodes, and hardly any clients are also routing nodes, so it does not get this potential protection that I2P has. Another distinguishing feature of I2P is that pretty much all of its users can be enumerated in little time, again because virtually all nodes are routing nodes. This makes I2P particularly weak to intersection attacks, where an attacker monitors who is currently online and looks for correlations between this set of users and the set of hidden services / pseudonyms currently making use of the network. Tor is not particularly weak to this sort of attack, because the entire list of Tor users is not so readily available. I2P also pads all packets to the same size, I believe 1 KB.

Freenet is the most different of the three anonymity networks. Whereas Tor focuses on allowing users to access the clearnet anonymously, and I2P focuses on allowing users to access Eepsites anonymously, Freenet focuses on allowing users to publish and access files while maintaining plausible deniability. Freenet also focuses very strongly on being resistant to censorship. I2P and Tor allow clients to anonymize their access to websites, and websites to hide their location from clients. Freenet, on the other hand, allows publishers to insert content and clients to retrieve it anonymously. Tor hidden services are essentially always hosted on single servers; I2P allows multihomed Eepsites, but frequently they are hosted on single servers as well. Freenet content, by contrast, is always redundantly hosted across the Freenet network of volunteer nodes. Freenet is not really for hosting a website like SR, with PHP etc.; it is more like an anonymous file sharing network. Of course, custom client side software can be made to give people the ability to use Freenet for various things; for example, there are Freenet email packages and Freenet forum packages. As with I2P, virtually all Freenet clients are also Freenet relays. What is unique to Freenet is that essentially all Freenet clients also store data for the entire network: in addition to sharing their bandwidth, Freenet clients share their hard drive space. Freenet gains its anonymity entirely from the fact that all clients are also relays, and data can travel through paths of vastly different lengths prior to arriving at its destination. This means that if a client requests a file through their neighboring nodes, the neighboring nodes cannot easily determine whether the client submitted the request themselves or is just forwarding it on for somebody else. Likewise, the fact that essentially all clients donate drive space to the network, and hold arbitrary files, means that if a content publisher adds a file to the network through their neighboring nodes, the neighboring nodes have a difficult time determining whether the publisher originally published the content or is just forwarding it on for someone else. Freenet uses two layers of encryption: one layer for the content and one layer for the links. An encrypted file on Freenet looks the same at all positions on the network, but as file chunks are transferred throughout the network they are sent through dynamically encrypted links. Freenet also swarms files over many nodes, and pads all file fragments to the same size.

Mixmaster and Mixminion are mix networks and remailers. Remailer networks are used for sending anonymous email messages. They usually layer encrypt forward messages and pad all of them to the same size (they must be padded to the same size at every hop as well). They derive their anonymity by mixing messages, i.e., time delaying messages until the mix gathers many of them, then reordering the messages prior to sending them out. This technique offers an extremely high degree of anonymity compared to the low latency networks like Tor and I2P. Even if an attacker watches the links between all mix nodes, the sender maintains their anonymity for some volume of messages sent. In fact, a single good mix on a message's path is usually all that is required to maintain anonymity. The academic literature regarding mix networks is vast and I cannot possibly hope to summarize it all here, but there are many different ways of constructing mixes as well as of attacking them. For example, one sort of mix is the threshold mix. In a threshold mixing scheme, the mix node gathers messages until a threshold is met (say, 100 messages) prior to reordering the messages and forwarding them on. An attack against a threshold mix is called a flushing attack. In a flushing attack, the attacker first waits for the target's message to enter the mix, at which point they flood the mix with their own messages up to the threshold. This forces the mix to flush the targeted message before a large enough crowd has gathered to hide it in, because the attacker's own messages can be filtered out by the attacker. An even more dangerous version of the flushing attack is the n-1 attack. In this case the attacker can delay the target's message, perhaps because they are the ISP of the target or the first node utilized by the target. The attacker flushes the subsequent mix entirely, then releases the target's message, then flushes the subsequent mix again. This leaves the target with a crowd size of 1, because the attacker has manipulated the mix into mixing the target's message only with filterable attacker messages.
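A toy model of a threshold mix and the flushing attack described above (the threshold, message labels, and filtering step are illustrative only, not any real remailer's behavior):

```python
import random

class ThresholdMix:
    """Toy threshold mix: buffer messages until the threshold is met,
    then shuffle the batch and flush it all at once."""
    def __init__(self, threshold, seed=0):
        self.threshold = threshold
        self.pool = []
        self.rng = random.Random(seed)

    def receive(self, msg):
        self.pool.append(msg)
        if len(self.pool) >= self.threshold:
            batch, self.pool = self.pool, []
            self.rng.shuffle(batch)  # reorder to break arrival-order links
            return batch             # the whole crowd leaves together
        return None                  # still gathering

# Flushing attack: the attacker fills the mix with 99 of their own
# messages plus the one target message, then filters their own back out.
mix = ThresholdMix(threshold=100)
for i in range(99):
    mix.receive(("attacker", i))
flushed = mix.receive(("target", 0))

known_attacker = {("attacker", i) for i in range(99)}
remaining = [m for m in flushed if m not in known_attacker]
print(remaining)  # [('target', 0)] -- an anonymity set of one
```

The shuffle gives the target a nominal crowd of 100, but since 99 of those messages are the attacker's own, the effective anonymity set collapses to 1.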

Anyway now that I have explained the very basics of the techniques currently being utilized, I want to brainstorm a bit about what the ideal anonymity network would entail. First of all we would need to think about what our goals are:

1. Clients and routers mutually inclusive (as in I2P and Freenet) or not necessarily (as in Tor)?

Networks where all clients are routers have advantages and disadvantages. The most obvious advantage is that they scale fantastically well and can deal with large volumes of bandwidth. The most obvious disadvantage is that not all users are able to route traffic for others. In the case of a network like Tor or I2P, it is very beneficial to have a large network, because the more nodes there are, the smaller the percentage of nodes the attacker can control, and therefore the less likely it is that the attacker can monitor a user's traffic at the network edges. In the case of mix networks it is actually a disadvantage to have a very large network. This is because the smaller the network is, the more concentrated the messages traveling through it are, and therefore the larger the crowd sizes messages can be mixed with. Another risk of all clients being routers is the possible added susceptibility to intersection attacks; we would want to think of a way to make it so the entire client list is not trivially enumerated, or else we would open up this risk.

2. For access to the clearnet or for hidden services?

Allowing access to the clearnet has advantages and disadvantages. Two of the primary disadvantages are that, for one, the exit node utilized is always going to be capable of spying on the user's traffic (although it could be user encrypted), and for two, the exit nodes are always going to be abused. The advantages of allowing exiting to the clearnet are that a lot more people will use the network, a lot more traffic will be routed through the network, and a much more diverse group of users will use the network.

3. For services on which content is put, or for content upon which services are built?

In the case of I2P and Tor, anonymity is provided to servers and content can be put on the servers. In the case of Freenet, deniability is provided for access to raw data, and any services that utilize Freenet need to be custom designed to work with the raw data retrieved through or published to Freenet. Both of these strategies have advantages and disadvantages. In the case of systems such as Tor and I2P, the advantage is that people are already accustomed to running services on top of servers, and there is already an enormous library of PHP scripts and such that can be run on anonymized servers. It takes a lot more work to provide a service on top of Freenet. On the other hand, in the case of Freenet the security is arguably increased, as people can move away from massively bloated all-purpose browsers and to task-specific bare-bones programs. Additionally, in the case of I2P and Tor, content is vulnerable to hackers who can penetrate the server it is hosted on; in the case of Freenet, the security of content mostly depends on the security of Freenet itself, which is likely to be more secure than most newbies can ever hope to make their own servers. Freenet is also essentially immune to content being censored or DDoSed, because content is spread across potentially thousands of different servers around the world.

4. High or low latency? Maybe variable latency??

Of course there are advantages and disadvantages to high and low latency networks. Low latency networks are snappy, like Tor and I2P. You can use them for browsing the internet and, although they feel a bit sluggish, they offer more or less the same experience as surfing the internet without an anonymity network at all. The downside to low latency networks is that they are severely crippled in the anonymity department. In the case of Tor, you are only safe until your attacker can watch a single packet originate at you and arrive at your destination. I2P might be a bit more resistant to this because all clients are also routers and tunnels can be variable length, but it is still weak to a variety of attacks. On the other hand, mix networks are slow. The current mix networks can take hours and hours to deliver a single message. They are also unreliable because of this; as nodes go down before messages reach them, the messages are dropped and never arrive. I2P and Tor are so fast that reliability is much higher: your traffic goes from the first node you select all the way to your destination site in a matter of seconds. Mix networks don't necessarily need to be slow, though; the only reason they are slow is so that each mix can gather an adequate number of messages prior to reordering and firing. On a very heavily used mix network, very small time delays would be enough to gather an adequate crowd size. On the plus side, mix networks offer significant anonymity: even if an attacker can observe all links between mix nodes and the internal state of all but one mix node, some level of anonymity is still maintained.


1023
Security / Re: clearnet with Tor- bad?
« on: June 09, 2013, 12:18 am »
I think in either case little is gained or lost. Enough people disable javascript that the set size is still going to be significant, and if somebody is skilled enough to hack you with javascript they can probably hack you without it as well. I choose to disable javascript primarily because I am more concerned with having a slightly more hardened browser than I am with blending into a larger crowd. I don't really care if they can determine that there is a 1:50,000 chance that kmfkewm visited some other site, versus a 1:500,000 chance. I would rather make life a little bit more difficult for the person who tries to root me.

1024
Security / Re: clearnet with Tor- bad?
« on: June 08, 2013, 11:51 pm »
Tor Project is *obsessed* with linkability; they focus disproportionately on preventing linkability attacks. Traceability has always been a secondary issue for them. Browser fingerprinting is a trivial sort of linking attack, and disabling JavaScript makes it substantially more effective (although the practical implications of this are debatable). Hacking somebody's browser with malicious JavaScript is an advanced sort of attack that can lead to tracing in addition to linking.

Tor Project ships TorBrowser with JavaScript enabled not because they don't care about traceability or people getting hacked, but because disabling JavaScript would break a lot of clearnet sites, and most Tor users wouldn't know that they can whitelist domains or turn off NoScript. They would think that TorBrowser is broken and stop using it. The Tor devs surmise that using Tor with JavaScript is better than not using Tor at all.

That being said, the Tor devs put NoScript in TorBrowser and it's easy to turn on if you're worried about JavaScript attacks.

What you said is true, but the Tor devs also tell people that they should leave javascript on because if they turn it off they will stick out from everybody who used the default settings. If you are mostly concerned with browser fingerprinting leading to linkability then this is good advice, if you are mostly concerned about somebody hacking you then it is bad advice. They assume that everybody is concerned primarily with linkability.

1025
Security / Re: clearnet with Tor- bad?
« on: June 08, 2013, 11:06 pm »
Tor was originally designed for surfing the clearnet anonymously. That has always been its focus. Hidden services were added later as a proof of concept, they are not and they have never been the primary focus of Tor. Some anonymity networks were designed with hidden services in mind, for example I2P is like the inverse of Tor in that it was originally designed for hidden services and the ability to exit was layered on to it later (I think it has like two user added exits). Freenet is another network with more of a focus on hidden services, it doesn't even have the ability to exit to clearnet. So it is kind of ridiculous to see people posting clearnet warnings, considering the fact that Tor has always been the anonymity network designed with clearnet in mind at every single step of its development.

You can open multiple tabs in your browser at the same time, clearnet and hidden services can both be accessed at the same time. You don't even need to avoid sites that require login, you only need to avoid sites that can link you to your real identity via your login. For example, if I go and register on a clearnet site with Tor it is fine for me to access it with Tor. Of course I am no longer anonymous because I have logged in, but I am still untraceable. The thing is that there are all kinds of different aspects to an anonymity network.

Anonymity:

Means that you are without a name. In a technical sense it means that you blend into a crowd of other people who all have exactly the same identifying characteristics. This crowd of people is called your anonymity set size. When you are on the internet you are never truly nameless, your browser has a lot of identifying information associated with it, the fact that you use Tor in itself means that you are somebody using Tor, etc. The best you can hope for is to use a browser that is the same as a lot of other people are using, using a network that a lot of other people are using, etc. This gives you a large anonymity set size, even though you are pseudonymous by the data points you reveal about yourself via browser etc, you are using the same pseudonym as so many other people that you are anonymous in the traffic analysis sense of the term.

Pseudonymous:

Means with a fake name. Technically you are always pseudonymous on the internet, but if you use the same pseudonym as a lot of other people then you have an anonymity set size. If you have a large anonymity set size you are referred to as being anonymous, even though you are still pseudonymous in the purest sense of the word. The way that I prefer to use pseudonymous is when your anonymity set size consists of 1. For example, when I browse SR without logging in, I blend in with everybody else using the same browser configuration as I am (of course there are other ways that anonymity can be broken, but in general). This means that my anonymity set size is roughly equal to the number of people who cannot be technically distinguished from me. When I log in to SR I am given the name kmfkewm, and now my anonymity set size falls to 1, so I am essentially pseudonymous.
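The set-size idea can be sketched numerically (the users and fingerprint fields here are invented purely for illustration):

```python
def anonymity_set(users, name):
    """Number of users an observer cannot technically tell apart from
    `name`: everyone sharing the same observable fingerprint."""
    fp = users[name]
    return sum(1 for other in users.values() if other == fp)

# Hypothetical fingerprints: (browser, language, window size).
users = {
    "kmfkewm": ("torbrowser", "en-US", "1000x900"),
    "alice":   ("torbrowser", "en-US", "1000x900"),
    "bob":     ("torbrowser", "en-US", "1000x900"),
    "carol":   ("firefox",    "de-DE", "1920x1080"),
}
print(anonymity_set(users, "kmfkewm"))  # 3: blends into a crowd of three
print(anonymity_set(users, "carol"))    # 1: effectively pseudonymous
```

Logging in under a unique username adds a data point shared with nobody else, which is what drops the set size to 1.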


Now Tor is a network that focuses on allowing people to maintain their anonymity. Using a pseudonym is the surest way to not actually get the anonymity that Tor offers you. But thankfully Tor also offers a variety of other things.

unlinkability: Is the property of an adversary not being able to associate two items of interest with each other. For example, if I publish a book with one pseudonym and another book with another pseudonym, ignoring writeprint analysis, I can assume that the two books are unlinkable. Tor offers some level of unlinkability because circuits rotate approximately once every ten minutes. Ideally, traffic sent down one circuit cannot be linked to traffic sent down another circuit. Of course, when you are pseudonymous (set size = 1) all of your traffic can be linked together, because your pseudonym is a datapoint that links the traffic. My posts here as kmfkewm can all be linked to the same person; if the forum allowed anonymous posting under the username 'anonymous', then posts I make anonymously would not be linkable to the same poster (at least between circuit rotations, and in all cases to observers who don't own the server).

untraceability: Is the property of an adversary not being able to identify the location of someone who they see traffic from. In a sense untraceability is unlinkability between a publisher and the item they publish (however, generally unlinkability is used to describe the relationship between two published items, and untraceability is used to describe the relationship between the publisher and the published item). For example, if I publish a book under my real name, but I mail it to my publisher with a fake return address and I never let on to where I live, I am not traceable. In the context of anonymity networks, somebody who is always traceable can always have their sessions linked together, but somebody who can always have their sessions linked together is not always traceable.

We are primarily worried about maintaining our untraceability. Tor is more focused on maintaining unlinkability, although in recent years they have started to be more balanced. In the past they rotated circuits every thirty seconds, which is great for unlinkability but significantly speeds up the rate at which a trace can be carried out. Anonymity and unlinkability go hand in hand, if you are anonymous then your sessions are inherently unlinkable, if you are not anonymous then inherently all of your sessions are linkable. Tor is an anonymity network and so of course their primary focus is unlinkability. For people who are not worried about anonymity as much as they are untraceability, it might not matter too much if your browser fingerprint is part of a smaller anonymity set size. This is especially the case if by making your browser fingerprint part of a smaller set size, you are also hardening yourself from hackers.

The risk of logging into clearnet sites with Tor is that if you log in to a site like facebook, then obviously it can identify you because it knows who you are. During the time that you are connected to facebook, all of the connections going through the circuit that you use to connect to facebook will therefore be linkable to your real identity. If you use the same circuit to visit facebook that you use to visit an illegal website, then your real identity is linkable to the illegal website by the exit node.

Another risk of clearnet websites is that the exit node can spy on any non-encrypted traffic that you send. Tor to the clearnet is strictly for anonymity, not for privacy. Technically speaking, privacy generally means that what you say cannot be read by unwelcome third parties, while anonymity again means that you blend into an anonymity set.

1026
First off..  I am NOT buying the ricin story.  That just screams manufactured fearmongering.


Second, let's just say that they do in fact snap pics of our mail.. If they are JUST putting this out on MMM now, then it's been going on for many years already.

At least the fact that they record the addresses on all mail has been known for years. They use OCR to record all shipping and return addresses. 

1027
Systems like this probably explain why, when there is a bulk interception, other bulk items from the same vendor are frequently intercepted as well. The lesson to learn from this is that vendors should space out their shipments as much as possible, rather than dumping all of them in a single outgoing box.

1028
Security / Re: Search engine
« on: June 08, 2013, 01:54 am »
Sounds good to me just so long as they don't block Tor. Google can go fuck themselves.

1029
Security / Re: Search engine
« on: June 08, 2013, 01:47 am »
I use Bing now. It has way better search results than DDG or startpage, and it lets me use Tor with it. Google lost a customer by blocking Tor.

1030
Security / Re: clearnet with Tor- bad?
« on: June 08, 2013, 01:28 am »
I should also add that by Tor Project's definition of 'anonymous' (which is the technically correct definition, mind you), none of us are anonymous anyway, we are all pseudonymous. Turning off javascript reduces your anonymity in the sense that you can now be identified by the 'pseudonym' that is your browser fingerprint, which is a rarer 'pseudonym' than the fingerprint of a browser with javascript enabled. However, this mostly only matters if you use Tor to access all kinds of sites. When I am posting on SR, the fact that I am named 'kmfkewm' already removes all of my anonymity; 'kmfkewm' is a more important pseudonym than the pseudonym I have due to my browser fingerprint. However, if I surf SR as well as other sites with this browser, then the 'pseudonym' of my browser fingerprint becomes more important, and server logs could indicate that somebody who shares my not-kmfkewm 'browser pseudonym' has surfed several different websites.

But even though I am not anonymous when I surf SR, I still don't want to be traceable. Not having javascript enabled makes me less traceable, having javascript enabled makes me potentially more anonymous (although not if I am posting as kmfkewm, and not if I am hacked!) but it also makes me more traceable.

1031
Security / Re: clearnet with Tor- bad?
« on: June 08, 2013, 01:20 am »
It is good technique to visit clearnet sites with TOR.  Don't login to personal accounts though and leave java off.
Disabling javascript actually makes you less anonymous according to the tor project, most exploits are taken care of by leaving it by default. My impression was that it was more secure to leave noscript off.

Tor Project argues that javascript should be left on due to the fact that most people leave javascript on. If you turn javascript off, your browser fingerprint is now much more identifiable, as you only blend in with the people who have turned javascript off. I argue that turning javascript off makes you more anonymous, because browser attacks that require javascript will no longer work against you. It comes down to a trade-off between browser fingerprint crowd size and browser hardening against hackers.

+1  That is exactly what I said except I said it in layman's terms.

I know I just wanted to lend support to your opinion.

Tor Project is *obsessed* with linkability; they focus disproportionately on preventing linkability attacks, and traceability has always been a secondary issue for them. Browser fingerprinting is a trivial sort of linking attack, and disabling javascript makes it substantially more effective (although the practical implications of this are debatable). Hacking somebody's browser with malicious javascript is an advanced sort of attack that can lead to tracing in addition to linking (as well as communications security compromise, stored data compromise, and essentially total compromise of the entire system).

1032
Security / Re: clearnet with Tor- bad?
« on: June 08, 2013, 01:15 am »
Javascript off = browser fingerprinting attacks can link your sessions together with more accuracy, because the crowd of browsers that share your browser's fingerprint is much smaller. This can theoretically lead to linkability attacks, but in practice enough people have javascript disabled that your web surfing is not going to stick out like a sore thumb. This sort of attack cannot be used directly to trace you and determine your real IP address; it only determines the probability that the person who visited site A also visited site B.

Javascript On = Browser fingerprinting attacks have their accuracy substantially reduced, as now your browser blends into the much larger crowd of browsers that leave javascript enabled.

___

Javascript On = You increase the attack surface of your browser; hackers can use malicious javascript embedded in websites to try to take over your browser and even root your system. Not all attacks require javascript to be enabled, but a substantial portion of them do. Disabling javascript makes it harder for a remote attacker to hack into your system through weaknesses in your browser. If an attacker does successfully pwn your browser, it will be a totally deanonymizing attack unless you used one of various isolation techniques (Whonix, host-only routing with a VM, mandatory access controls, etc.).

Javascript Off = You can still be hacked through your browser, but you significantly reduce the risk of this happening.
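The crowd-size argument above can be made concrete with a toy surprisal calculation. The population figure and the fraction of users with javascript disabled below are made-up assumptions for illustration, not measurements:

```python
import math

# Hypothetical numbers: how big is the crowd you blend into with
# javascript on vs off, and how many bits of identifying information
# does that one on/off bit leak (surprisal: -log2 of the probability
# of your configuration)?
population = 1_000_000           # assumed set of Tor Browser users
js_off_fraction = 0.05           # assumed share with javascript disabled

js_on_crowd = int(population * (1 - js_off_fraction))
js_off_crowd = int(population * js_off_fraction)

bits_on = -math.log2(js_on_crowd / population)
bits_off = -math.log2(js_off_crowd / population)

print(f"js on:  crowd of {js_on_crowd}, {bits_on:.2f} bits leaked")
print(f"js off: crowd of {js_off_crowd}, {bits_off:.2f} bits leaked")
```

With these made-up numbers, disabling javascript leaks about 4.3 bits of linking information versus roughly 0.07 bits for the majority configuration; that fingerprinting cost is what the hardening benefit has to outweigh.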

1033
Security / Re: clearnet with Tor- bad?
« on: June 08, 2013, 01:07 am »
It is good technique to visit clearnet sites with TOR.  Don't login to personal accounts though and leave java off.
Disabling javascript actually makes you less anonymous according to the tor project, most exploits are taken care of by leaving it by default. My impression was that it was more secure to leave noscript off.

Tor Project argues that javascript should be left on due to the fact that most people leave javascript on. If you turn javascript off, your browser fingerprint is now much more identifiable, as you only blend in with the people who have turned javascript off. I argue that turning javascript off makes you more anonymous, because browser attacks that require javascript will no longer work against you. It comes down to a trade-off between browser fingerprint crowd size and browser hardening against hackers.

1034
Of all the 3 letter agencies that operate in the United States, the NSA has always scared the shit out of me the most. In so many ways I feel as if they are the most shadowy organization in the US, more so even than the CIA.

QFT. This comes as no surprise unfortunately after Tripwire, Utah Data Center, etc. The NSA essentially has carte blanche to do anything they want surveillance-wise in the name of "Foreign Intelligence". Like a page right out of 1984.

The NSA has always scared the shit out of me the least, because I know that I am not targeted by them. I am much more concerned with the DEA and FBI.

1035
Pretty much it reduces to this:

if a% of b == c, and if d% of e == b, then a% of d% of e == c

10% of 100 = 10, 10% of 1000 = 100, 10% of 10% of 1000 = 10

if 90% of autistic people are atheists, and if 10% of people with a certain gene are autistic, then at least 90% of 10% of people with that gene are atheists.

I can tell that I am correct in purely mathematical sense but perhaps it doesn't apply to non-mathematical reality? For example:

if 10% of cats are brown, and if 10% of my animals are cats, then at least 10% of 10% of my animals are brown.

I imagine that probabilistically this would be true, but it is still entirely possible for all of my cats to be white.
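The arithmetic above can be sketched in a few lines: the composition identity always holds for the expected counts, while the cat example shows why a global rate can fail in any particular sample (all numbers here are made up for illustration):

```python
# The composition identity from the post: if a% of b == c and
# d% of e == b, then a% of (d% of e) == c. Checked with the
# cat example's rates over a hypothetical 100 animals.
a, d, e = 10, 10, 100            # 10% brown, 10% cats, 100 animals
b = d / 100 * e                  # 10% of 100 animals are cats -> 10
c = a / 100 * b                  # 10% of those cats are brown -> 1
assert a / 100 * (d / 100 * e) == c

# The catch the post raises: a global rate need not hold in a
# particular sample. A made-up menagerie where every cat is white:
my_animals = ["white cat"] * 10 + ["dog"] * 90
my_brown_cats = sum(1 for x in my_animals if x == "brown cat")
print(c, my_brown_cats)          # 1.0 brown cat expected, 0 observed
```

So the identity is sound as arithmetic on expected counts, but turning it into a claim about a specific subgroup silently assumes that subgroup inherits the global rate.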
