
Posts posted by lattera

  1. Hey guys, I was wondering if anyone had a spare MSDN admin account I could have, please. It would be very much appreciated!

    PM me if you have one, please,

    Or add me on MSN:

    Thanks a ton in advance!

    What is it that you're trying to find? I'm willing to bet no one is going to give up the password for their MSDN account.

    I need an Admin MSDN account. I will buy one if need be.


    Thread closed and account banned due to illegal activity and requests thereof.


  2. Virtualization is great for developers. It allows us to test different scenarios, keep organized, and maintain a safe and sane environment. I only develop inside VMs. I hate cluttering my main OS install with non-production-ready code, especially if I'm dealing with touchy things like the kernel. Virtualization in the enterprise allows for server consolidation, cloud hosting, failsafes, etc. I use virtualization heavily at work: I use multiple computers and multiple VMs on each computer for a vuln-dev lab. If virtualization technology didn't exist, my employer would have to provide me with over ten servers.

    However, virtualization isn't the end-all-be-all solution. Sometimes you need to test your project on real hardware or in real-life situations. As with all decisions, evaluate your needs and see if virtualization is a good option.


  3. I tend to use the OS that fits the job best. On my laptop, I run OSX. On my workstation at work, I use Solaris 11 Express. In my vuln-dev lab, I use a mixture of Linux, Windows, and Solaris. I'm biased towards Solaris because of ZFS, DTrace, Xen, and Crossbow.


  4. Hacking is very much alive. Take a look at full-disclosure. Take a look at the industry. I would be considered a whitehat hacker--I get paid to hack (legally, of course). I think you just need to know the scene. The scene is much broader these days, ranging from groups of script-kiddies who somehow get their hands on 0days to very talented individuals. You'll find varying degrees of expertise and maturity in all hacking communities. It's definitely hard to pinpoint a definition of hacking. Is it merely finding vulnerabilities and writing exploits? Is it using developed exploits against others for profit or fame? Is it limited to the digital world? I'll leave the definition up to you, but suffice it to say that whatever hacking is, it isn't dead.


  5. Here is the final shell script:


    #!/bin/bash
    # Snapshot a share, then sync other shares from that snapshot.
    # Note: the sync function below shadows /bin/sync within this script.

    snapshot() {
        echo "[*] Snapshotting $1 @ $2"
        zfs snapshot "tank/shares/cifs/CompanyData/$1@$2"
        echo " [+] Snapshot of $1 @ $2 done"
    }

    sync() {
        echo "[*] Syncing $1 with snapshot..."
        rsync -aA "/tank/shares/cifs/CompanyData/$2/.zfs/snapshot/$3/" \
                  "/tank/shares/cifs/CompanyData/$1/"
        echo " [+] Syncing $1 done!"
    }

    DATE=`date '+%F_%T'`

    if [ $# -eq 0 ]; then
        snapshot Prod "$DATE"
        sync Dev Prod "$DATE"
        sync Alpha Prod "$DATE"
    elif [ $# -eq 1 ]; then
        snapshot Prod "$DATE"
        sync "$1" Prod "$DATE"
    elif [ $# -eq 2 ]; then
        snapshot "$2" "$DATE"
        sync "$1" "$2" "$DATE"
    else
        echo "usage: $0 [dest [source]]" >&2
        exit 1
    fi
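    A minimal dry-run sketch of how the script above would be driven. The zfs and rsync commands here are stubbed out as hypothetical echo functions so the flow can be followed (and run) without a ZFS pool; the real script runs the actual commands. The sync function is named sync_env in this sketch only to avoid shadowing /bin/sync.

```shell
#!/bin/bash
# Hypothetical stubs: print the command instead of running it.
zfs()   { echo "zfs $*"; }
rsync() { echo "rsync $*"; }

# Same shapes as the script's functions, minus the status messages.
snapshot() { zfs snapshot "tank/shares/cifs/CompanyData/$1@$2"; }
sync_env() { rsync -aA "/tank/shares/cifs/CompanyData/$2/.zfs/snapshot/$3/" "/tank/shares/cifs/CompanyData/$1/"; }

DATE="2011-01-01_00:00:00"   # a fixed example timestamp

snapshot Prod "$DATE"        # snapshot prod first...
sync_env Dev Prod "$DATE"    # ...then refresh Dev (and Alpha) from it
```

    Note how the source path reaches into the snapshot's hidden `.zfs/snapshot/` directory, so the sync reads from a frozen, consistent copy while production stays live.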


  6. My employer's flagship product generates thousands of PDFs. We have three different copies of our product: a development copy for Quality Assurance testing of recently-written code, a staging copy to test pushing new versions of our product, and a production copy that our users utilize. Each copy requires 42GB worth of PDFs. New PDFs are generated every day.

    To maintain a sane development environment, we pull fresh copies of the PDFs every month from production. We pull from production to staging, then from production to development. The PDFs are stored on two separate servers that both use NTFS. We use Microsoft SyncToy to sync the PDFs across environments. The process can take several hours for each environment. The network load is high due to the PDFs being stored on multiple servers.

    I recently had an idea. What if we store the PDFs on our ZFS NAS? We could use ZFS snapshotting and rsync to refresh the environments, and we could do that on a regular basis via a cron job. ZFS snapshots take a few seconds, and rsync is a really efficient tool. No network traffic is involved since the synchronization takes place entirely on the same server.

    Here are the commands we would run:

    DATE=`date '+%F_%T'`
    zfs snapshot tank/site_data/prod/PDFs@$DATE
    rsync -a /tank/site_data/prod/PDFs/.zfs/snapshot/$DATE/ /tank/site_data/dev/PDFs/

    I really like this solution. Right now, we have to jump through a lot of hoops to sync up these PDFs. This will save us time, space, and internal bandwidth.
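    Since everything runs locally on the NAS, the refresh could be scheduled with a cron entry along these lines (the monthly schedule and the wrapper-script path are assumptions for illustration, not part of the actual setup):

```shell
# m h dom mon dow  command
# Hypothetical entry: refresh the environments from a fresh prod snapshot
# at 02:00 on the first of each month. refresh_pdfs.sh would wrap the
# zfs snapshot and rsync commands shown above.
0 2 1 * * /usr/local/bin/refresh_pdfs.sh
```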

    This article was originally posted on my tech blog.


  7. I just looked at it with dtrace. I'll take a look at it with ltrace on a Linux system sometime this week. I need to spin up a Linux VM.

    shawn@shawn-work:~$ uname -a
    SunOS shawn-work 5.11 oi_147 i86pc i386 i86pc Solaris
    shawn@shawn-work:~$ cat test.d
    shawn@shawn-work:~$ pfexec dtrace -s test.d -c ./test
    dtrace: script 'test.d' matched 1 probe
    16 60 58 83 23 49 19 96 72 70 49 92 5 47 60
    dtrace: pid 21456 has exited
    0 71978 srand:entry 1289239423

    edit[0]: I know that showing dtrace doesn't help much in your case, but I thought I'd take the time to market it anyway. ;)


  8. Question for people who do contribute to open source projects.

    In one of my classes we have been tasked with developing a new feature for a current open source project. I'm not going to state which project it is (privacy/doc dropping), but in this case all there is, is the SVN repo and hardly any documentation at all; in fact, pretty much none except for the end-user instructions.

    Is this often the case? I currently find myself with a massive project in Visual Studio, and that's about it. It's kinda overwhelming, to be honest.


    It really depends. For me, if the project is just some hobbyist tool meant to solve a little problem, I likely won't write documentation. I'll let the code document itself. However, if the project is meant to be more serious, I'd document it both inside the source through comments and through API documentation. If the code is obscure, but meant to be reused within a few years, I'll likely just comment the code.


  9. I'm sorry I haven't kept BinRev up to date with this. I'm getting married in December. I haven't had time to do much other than plan for the wedding/reception and pick up a couple extra jobs to pay for some really expensive medical bills. Hopefully we'll restart BinRev Radio yet again early in 2011. If someone would like to make a show, though, I'd be happy to post it. BinRev is a community, and as such can be controlled by the community. Let me know if you're interested in hosting.


  10. Here are my results:

    95% Zenwalk (a distro I've never heard of)

    95% Slackware

    95% Gentoo

    I'm ashamed Gentoo is on my results. I'm more of an OpenSolaris or FreeBSD guy. I run FreeBSD 8-STABLE at work and OpenSolaris and OSX at home. I don't have a single Windows or Linux box on my network, a fact I'm very proud of.


  11. If the box has FireWire, then you can use existing tools to gain administrator access via a FireWire exploit. The FireWire spec mandates Direct Memory Access (DMA), which means that any FireWire device has full access to all physical memory.


  12. Well, it's not just about an extra cost for security. It's about knowing that you yourself are secure, along with your files and anything else of importance.

    I'm not sure about some of you, but there are certain things I would just really like to keep 100% private. Even the simplest text encryption could help you in a way.

    If you think you aren't vulnerable to attacks such as exploits, then you are highly mistaken. It only takes visiting one site to get exploited, and the thing is, you will never even know it.

    Like I said, it's about knowing you are safe, secure, and protected.

    I think the underlying ideology you're starting to argue could turn into a broad debate. I don't want this thread to go that way, so I'll say only what time has proven: security is a tradeoff, usually one involving time and money. Risk analysis can be done to determine how serious each weakness is. That's what I'm in charge of at work: finding vulnerabilities, classifying them by severity, and making cost-effective suggestions. Management may approve or disapprove based on budgetary constraints or otherwise. If Seal thinks his current setup correctly handles the risk he's willing to take, then he'll continue what he's doing. Remember that no system is 100% secure, and being networked, Seal's systems (and yours) are part of a complex system.

    That being said, I'd prefer we stick to the topic of encryption. What items do we choose to encrypt? What do we let go?


  13. Cryptography is a wonderful tool. It has its uses in many different areas. Full-disk encryption helps protect against offline attacks. IM encryption, if done right, helps protect against Man-In-The-Middle (MitM) attacks and eavesdropping. However, cryptography isn't meant to be the end-all-be-all of security. With any piece of data, humans are involved. Humans are the weakest link. We can secure our systems, data, and networks by using sophisticated tools including encryption. But even the best digital security practices can be easily foiled by a simple phone call.

    We use encryption at work for certain pieces of data and certain protocols. We use it to secure our VPN traffic. We use it to secure IMs. When IllumOS supports ZFS encryption, I will definitely make use of it.

    Overall, encryption is a great resource. But don't mistake it for being the solution to all security-related issues.